A proponent of strong collaborative software engineering practices: agile development, continuous integration, code review or pairing, unit testing, refactoring, and related approaches
Excellent problem-solving and critical-thinking skills; demonstrated ability to employ fact-based decision-making to resolve complex problems by applying logical analysis, experience, and business knowledge
Possess a passion for technology and stay sharp in your craft by keeping on top of new technologies, tools, and trends
Ensure and manage excellent customer relationships
Demonstrable passion for technology (e.g. personal projects, open-source involvement) and the problem-solving skills to deliver solutions with a first-class engineering approach
Engineer world-class products with maximum efficiency and agility
Enable improvement of the engineering team by shaping tools, processes, and standards
Interact with Quants and Analysts to understand their workflows and requirements
Collaborate with your engineering manager to enable a fit-for-purpose application portfolio consistent with the target architecture and operating model
Produce comprehensive, usable dataset documentation and metadata
Evaluate and make decisions around dataset implementations designed and proposed by peer data engineers
Ensure excellent customer relationships
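As an illustration of the dataset-documentation responsibility above, a minimal sketch in Python with pandas; the function name, field names, and sample data are hypothetical, not from the posting:

```python
import pandas as pd

def describe_dataset(df: pd.DataFrame, name: str) -> dict:
    """Produce basic machine-readable metadata for a dataset:
    row count plus per-column dtype and null count."""
    return {
        "name": name,
        "rows": len(df),
        "columns": [
            {"name": c, "dtype": str(df[c].dtype), "nulls": int(df[c].isna().sum())}
            for c in df.columns
        ],
    }

# Hypothetical sample dataset
trades = pd.DataFrame({"instrument": ["AAPL", "MSFT"], "qty": [100, 250]})
meta = describe_dataset(trades, "trades")
```

In practice such metadata would be enriched with ownership, lineage, and refresh-schedule fields and published alongside the dataset.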
BS degree in Computer Science, Applied Mathematics, related field, or equivalent practical experience
7+ years of progressive engineering experience with 3+ years in Data Engineering
Minimum of 2 years designing and building large-scale data loading, manipulation, processing, analysis, blending, and exploration solutions using emerging technologies such as the Hadoop ecosystem (MapReduce, Hive, HBase, Spark, Sqoop, Flume, Pig, Kafka, etc.), NoSQL stores (e.g. Cassandra, MongoDB), and in-memory data technologies
Minimum of 3 years of experience preparing and refining data sets and derived data in both emerging (Python, Scala, Spark) and traditional tools (Trifacta, Alteryx, Informatica, SQL)
Experience extracting, aggregating, and structuring large data sets along different dimensions (e.g. position, instrument, or counterparty level)
Strong interpersonal skills; able to establish and maintain a close working relationship with quantitative researchers, analysts, traders and senior business people alike
Able to thrive in a fast-paced, high energy, demanding and team-oriented environment
Advanced degree in one or more of the following disciplines: Computer Science, Mathematics, Statistics, Economics, Quantitative Finance, or another quantitative, numerical, or computing discipline
Experience building containerized applications and deploying to public or private clouds, such as Amazon Web Services (AWS), Microsoft Azure, or similar providers.
Open source involvement such as a well-curated blog, accepted contribution, or community presence
Proficient with a range of open-source frameworks and development tools, e.g. Angular, Node, Spring Boot, .NET Core, Flask, NumPy, SciPy, Pandas
A solid understanding of financial markets and instruments
Experience of front-office software development at an asset manager, hedge fund, or investment bank
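The aggregation skill described above (structuring data along position, instrument, or counterparty dimensions) can be sketched in Python with pandas; the column names and figures below are illustrative assumptions, not from the posting:

```python
import pandas as pd

# Hypothetical position-level data
positions = pd.DataFrame({
    "counterparty": ["A", "A", "B"],
    "instrument": ["AAPL", "MSFT", "AAPL"],
    "notional": [1_000_000, 500_000, 250_000],
})

# Roll position-level notional up to the instrument and counterparty levels
by_instrument = positions.groupby("instrument", as_index=False)["notional"].sum()
by_counterparty = positions.groupby("counterparty", as_index=False)["notional"].sum()
```

The same groupby pattern scales to Spark DataFrames for data sets too large for a single machine.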