Replatforming a data-modeling application in AWS gives energy traders and analysts quicker access to deal-making insights
In the trading and investment industry, fast access to data is key. That speed has long been a competitive advantage for our client: the commodities merchant and infrastructure asset investor deals in a range of energy markets, and their research capabilities and emphasis on innovation have been pivotal to their ongoing success. However, in a world where access to data can make or break a transaction, the risk of slowed performance looms large.
The company’s traders and analysts relied on an internal application built around a key analytical model to help them establish priorities. To keep employees equipped with the insights they need, at the speed they need them, the company decided to move parts of the application to a more agile and affordable cloud platform. Their IT team had limited experience with cloud technologies, so they turned to Headspring as a partner to help launch this business-critical effort. We leveraged our cloud expertise to help our client replatform their application in the cloud, and in record time, too!
The challenges:
- Local, on-prem installations of the application were costly and cumbersome; the company could generate more value in the cloud
- It was taking several hours to run a certain number of scenarios; their goal was to run even more scenarios in just minutes
- They also needed to keep costs for running multiple scenarios down while enabling scale at a predictable rate
- The team needed to port multiple streams of data into the Snowflake data warehouse at once
- This was an important cloud re-platforming effort, with lots of eyes monitoring progress, so a smooth transition was imperative
- Access to cloud resources with input and output data needed to comply with the company’s security requirements
The solutions:
- Refactored and containerized the application logic and moved it to a cloud environment where it could run in parallel on a serverless architecture
- Worked with the client’s team to determine which aspects of the workflow to move while maintaining control over input data generation
- Architected the system to run multiple data streams in Snowflake and developed a schema for storing the data
- Rewrote and Dockerized the application to accept arguments that specify which scenario to run
- Created an Apache Airflow dashboard for kicking off and monitoring tasks, using cloud resources that enabled our client to cost-effectively configure and run multiple instances of the application with different scenarios
- Trained the client’s team on CI/CD best practices and created a run book to support ongoing maintenance
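To give a flavor of the "accepts arguments that specify which scenario to run" pattern described above, here is a minimal sketch of a scenario-selecting entrypoint. This is illustrative only, not the client's actual code: the `--scenario` flag and the `run_scenario` function are assumed names.

```python
import argparse

def run_scenario(scenario_id: str) -> str:
    # Placeholder for the analytical model. In the real application this
    # would load the scenario's input data and write results to Snowflake.
    return f"completed scenario {scenario_id}"

def main(argv=None) -> str:
    # Accept the scenario as a command-line argument, so the same container
    # image can be launched many times, each run targeting a different scenario.
    parser = argparse.ArgumentParser(description="Run one modeling scenario")
    parser.add_argument("--scenario", required=True, help="Scenario identifier")
    args = parser.parse_args(argv)
    return run_scenario(args.scenario)

if __name__ == "__main__":
    print(main())
```

With an entrypoint like this, the Docker image can be run as `docker run <image> --scenario 42`, which is what lets an orchestrator fan the same image out across many scenarios.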
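The fan-out idea behind the parallel serverless runs can be illustrated locally with a thread pool standing in for the cloud platform. Again, a sketch under assumptions: in production each call would be a separate serverless invocation of the container, and the scenario IDs here are made up.

```python
from concurrent.futures import ThreadPoolExecutor

def run_scenario(scenario_id: int) -> tuple:
    # Stand-in for one containerized model run; in the cloud, each call
    # would execute in its own serverless instance and persist to Snowflake.
    return scenario_id, f"result-{scenario_id}"

def run_all(scenario_ids):
    # Launch every scenario concurrently and collect the results, which is
    # why total wall-clock time approaches the cost of the slowest scenario
    # rather than the sum of all of them.
    with ThreadPoolExecutor(max_workers=8) as pool:
        return dict(pool.map(run_scenario, scenario_ids))

if __name__ == "__main__":
    print(run_all(range(4)))
```

The design point is that once the application is a stateless container parameterized by scenario, scaling is just a matter of how many instances the platform launches at once, which is also what makes the cost per batch predictable.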
The results:
- The application runs up to 24 times faster after moving containerized logic to a serverless architecture
- Scenario runtime has decreased 96%, dropping from several hours to just minutes
- The codebase has been upgraded to the latest version, and new code efficiencies have been established
- The solution was delivered well within the timeline, allowing greater focus on a collaborative and thorough knowledge transfer
- The client’s own team now has more knowledge and best practices to keep innovating in the cloud