Engineering is a key driver of innovation and development. Our mission is to bring engineering to the world of finance. To that end, we apply scientific principles to solve practical financial problems. We manage the allocation of funds, report on risks and draw insights from market behavior.
To achieve our goals we build teams that harness creativity, expertise and dedication. We believe in the power of interdisciplinary exchange, critical thinking and continuous learning. Our disciplines span applied mathematics, data science, computer science and economics. Knowledge is everything!
Our core business is building software. We draw on experience from leading large, enterprise-scale implementations spanning architecture design, implementation and operations. Yet we also work with people who carry out hands-on analyses and quick design studies. We believe that the best architectures emerge from early prototyping, iterating and improving.
Our biggest strengths are quality and attention to detail. We achieve them through a high level of scrutiny during testing and review. We apply state-of-the-art techniques for test automation, test management and manual reviews during all stages of software development. We aim for the highest standards at all times.
As engineers we start every project with an assessment of goals and available resources. We build early prototypes, carry out design studies and perform market analyses. Our deep understanding of finance, mathematics and technology helps us bridge the gap between stakeholders and find effective solutions to complicated problems. We place strong emphasis on early validation with mockups, minimum viable products and prototypes.
As engineers we build every product to the highest standards and with highly skilled personnel. We prefer to work in agile and lean settings with frequent iterations and constant feedback. Our value chains usually comprise the entire sequence from data sourcing and data processing through infrastructure deployment to front-end design. Our systems run on modern clouds and utilise state-of-the-art services for processing, storage and distribution.
As engineers we strive for the highest quality standards. We continuously track a wide range of quality measures: we test running software artefacts, perform static code analysis and monitor the effectiveness of the overall system. We have experts in test automation and test management.
Handling and analyzing market data is one of our long-standing core competencies. Through our long history in finance we have accumulated a range of tools and techniques for dealing with market data artefacts, trading conventions and special data sources. Unfortunately, there is much less standardization in the quotation and formatting of prices, volatilities and correlations than one would wish for. Aggregating this data into matrices, surfaces and model parameters is part of our daily business.
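As a minimal sketch of this kind of aggregation (field names and the unit convention are hypothetical, and real conventions handling is far richer), scattered volatility quotes can be collected into a surface keyed by expiry and strike:

```python
from collections import defaultdict

def build_vol_surface(quotes):
    """Aggregate raw option quotes into an (expiry, strike) -> vol grid.

    Quotes from different sources may repeat a grid point; here we
    simply average duplicates. Quotation-unit normalisation stands in
    for the many convention fixes real market data requires.
    """
    buckets = defaultdict(list)
    for q in quotes:
        # Normalise percentage quotes to decimals (hypothetical convention).
        vol = q["vol"] / 100.0 if q.get("unit") == "percent" else q["vol"]
        buckets[(q["expiry"], q["strike"])].append(vol)
    return {key: sum(vols) / len(vols) for key, vols in buckets.items()}

quotes = [
    {"expiry": "1Y", "strike": 100, "vol": 20.0, "unit": "percent"},
    {"expiry": "1Y", "strike": 100, "vol": 0.22, "unit": "decimal"},
    {"expiry": "2Y", "strike": 110, "vol": 0.25, "unit": "decimal"},
]
surface = build_vol_surface(quotes)
# The duplicated 1Y/100 point is averaged to 0.21.
```

The same pattern, with an interpolation step on top, extends to correlation matrices and model parameter grids.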
With an increasing fraction of business happening online, we are starting to generate much more customer-specific data, including customer behavior such as retention and conversion. By varying advertised prices and product parameters it is easier than ever to monitor changing customer demand. To generate such data, the passive role of market observation is extended with the design of experiments that yield meaningful results.
With our background in mathematical modelling we apply our knowledge of statistics and model building to all fields of financial data. We place strong focus on designing and analysing tests with the rigor required to ensure statistical validity. Results should be visually appealing and convey their message intuitively, but above all they must be reliable and correct.
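As a hedged illustration of the statistical rigor we mean (visitor counts and conversions are invented), deciding whether a pricing variant really changed the conversion rate can use a two-proportion z-test rather than eyeballing the rates:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z statistic for the difference of two conversion rates.

    Under the null hypothesis both variants share the pooled rate;
    |z| > 1.96 indicates significance at roughly the 5% level.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical experiment: 5000 visitors per pricing variant.
z = two_proportion_z(conv_a=400, n_a=5000, conv_b=460, n_b=5000)
significant = abs(z) > 1.96  # here the uplift is significant
```

Running the significance check before drawing conclusions is exactly the step that separates a valid experiment from an anecdote.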
The sheer volume of data makes it increasingly difficult to perform computations and analyses on premises. As engineers we are interested in systems that work efficiently and reliably. Cloud services often provide the only viable way to scale a business rapidly while remaining open to changes in the business model and its computation requirements.
With our data-centric background we have a special focus on the deployment of data processing services. Out-of-the-box solutions for machine learning and cognitive processing are becoming increasingly attractive. The work of our architects is shifting away from connecting software libraries and data formats towards configuring cloud services and data pipelines.
Monitoring resource use, costs and security is another aspect of our work. As the scale of the physical machinery activated by a computation grows, so does the demand for oversight. We make sure that resource efficiency is monitored alongside the consistent application of security policies, encryption and data protection.
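A minimal sketch of this kind of cost oversight (service names, amounts and budgets are invented; a real setup would read a billing export and raise alerts):

```python
def check_budgets(usage, budgets):
    """Return the services whose accumulated cost exceeds their budget.

    `usage` is a list of (service, cost) records as a billing export
    might provide; `budgets` maps each service to its monthly limit.
    Services without a budget entry are never flagged.
    """
    totals = {}
    for service, cost in usage:
        totals[service] = totals.get(service, 0.0) + cost
    return sorted(
        service for service, total in totals.items()
        if total > budgets.get(service, float("inf"))
    )

usage = [("compute", 120.0), ("storage", 30.0), ("compute", 95.0)]
budgets = {"compute": 200.0, "storage": 50.0}
over = check_budgets(usage, budgets)  # compute totals 215.0, over its 200.0 cap
```

The same aggregate-then-compare shape applies to quotas on storage, API calls or data egress.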
We always put strong emphasis on quality. We thoroughly test and review everything we build. This devotion to accuracy is part of our engineering philosophy. Without accuracy there is no value in data analysis. Therefore we focus on testing and quality metrics right from the beginning of our projects.
Test planning and test management vary widely with the type of project that requires quality guarantees. (See our case study.) Any late discovery of defects indicates a potential flaw in the process: it starts with a flaw in the software but ends with a hole in the quality process. We believe that quality can only improve if both are fixed with equal attention.
Our main expertise lies in end-to-end front-end testing as well as in data-driven testing. Front-end test automation becomes effective with a deep understanding of machine learning, so that click flows can be trained with the least amount of custom code. We employ image recognition and understanding to extract relevant features. Once outputs become too large for manual inspection and too varied for simple comparison, we return to our favorite discipline: data analysis and statistics.
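As a minimal sketch of a statistical test oracle (the data and tolerances are illustrative), large numeric outputs can be compared to a baseline by summary statistics instead of element-wise equality, which fails for noisy or reordered data:

```python
import statistics

def outputs_match(baseline, current, mean_tol=0.01, std_tol=0.02):
    """Compare two large numeric outputs via summary statistics.

    The test passes when mean and population standard deviation agree
    within the given tolerances; a real oracle might add quantile or
    distributional checks on top.
    """
    return (
        abs(statistics.mean(baseline) - statistics.mean(current)) <= mean_tol
        and abs(statistics.pstdev(baseline) - statistics.pstdev(current)) <= std_tol
    )

baseline = [0.1 * i for i in range(1000)]
current = [0.1 * i + 0.001 for i in range(1000)]  # tiny systematic shift
ok = outputs_match(baseline, current)  # shift is within tolerance
```

Tightening `mean_tol` turns the same check into a detector for exactly such small systematic drifts.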
is COO of Thetaris GmbH. He has worked in the financial industry for several years, including trading and risk management for a bank and asset-liability management for an insurance company. He is the architect of ThetaML and Thetaris’ testing application. His clear focus on measurable progress boosts business value quickly and sustainably.
Any IT application requires testing before a successful Go-Live. The automation of these tasks allows the inexpensive repetition and continuous monitoring of functionality and has become commonplace. It is an integral part of modern Continuous Integration and Continuous Delivery (CI/CD). As costs for implementation and adaptation are the driving factors in deciding whether to automate testing, they will be the focus of this paper.
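As a toy illustration of what gets automated (the function under test and its checks are invented), a suite that a CI/CD pipeline would run on every change might look like:

```python
import unittest

def discounted_price(price, rate):
    """Apply a discount rate; a hypothetical function under test."""
    if not 0 <= rate <= 1:
        raise ValueError("rate must lie in [0, 1]")
    return price * (1 - rate)

class DiscountTest(unittest.TestCase):
    def test_nominal(self):
        self.assertAlmostEqual(discounted_price(100.0, 0.2), 80.0)

    def test_invalid_rate(self):
        with self.assertRaises(ValueError):
            discounted_price(100.0, 1.5)

# Run the suite programmatically; in CI/CD the test runner is invoked
# on every commit, so regressions surface long before Go-Live.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(DiscountTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Writing such a test once costs more than one manual check; the economics discussed in this paper hinge on how often it is then repeated for free.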
Testing has a bad reputation among developers because it is often associated with dull, repetitive activities that leave little room for creativity. Any deviation from a given procedure may trigger an unintended response or spoil the test result. Nobody really enjoys such tasks.
Theta Suite proved ideal for modelling different insurance product designs as well as hedging instruments and strategies in a transparent way. It allowed the customer to quickly develop an innovative and attractive insurance product. Gaining insight into the risk profile and the effects of hedging enabled the customer to secure profits.