Job Description
Key Responsibilities
- Develop and maintain web crawlers and data scraping scripts to extract and process data from various sources, including financial platforms and market data feeds.
- Collaborate with the team to design and implement new features, contributing to the evolution of the trading platform while maintaining system stability and scalability.
- Optimize the performance of trading systems through efficient data handling and processing, ensuring real-time responsiveness and minimal latency.
- Continuously improve system performance by identifying bottlenecks and implementing solutions such as algorithmic enhancements or infrastructure upgrades.
- Analyze collected data to derive actionable insights and support decision-making, while maintaining data integrity and reliability.
 
Job Requirements
- Proficient in Python programming, with a strong understanding of asynchronous programming using libraries such as asyncio or aiohttp to handle high-throughput tasks.
- Familiar with key performance-optimization techniques, including caching strategies, database indexing, and code-level efficiency improvements for large-scale applications.
- Experienced in data collection, maintenance, and analysis for trading systems, including data pipelines, ETL processes, and data storage solutions.
- Strong problem-solving skills and the ability to work independently on complex tasks, including debugging, system architecture design, and performance benchmarking.
- Excellent communication skills for collaborating effectively with team members and stakeholders, documenting code, and sharing technical insights.
- Preferred: knowledge of distributed systems, cloud computing platforms (e.g., AWS, Azure), and financial market data protocols (e.g., FIX, WebSocket).
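To illustrate the asynchronous high-throughput pattern the requirements refer to, here is a minimal sketch of concurrency-bounded fetching with the standard-library asyncio. In a real crawler the `fetch` body would issue an aiohttp request; the simulated payload, the `max_concurrency` value, and the function names are illustrative assumptions, not part of the posting.

```python
import asyncio

async def fetch(url: str, sem: asyncio.Semaphore) -> str:
    # Bound concurrency so a large URL list doesn't overwhelm the target
    # host or exhaust local sockets.
    async with sem:
        await asyncio.sleep(0)  # stand-in for an aiohttp request (network I/O)
        return f"payload:{url}"  # simulated response body

async def fetch_all(urls, max_concurrency: int = 10) -> list[str]:
    sem = asyncio.Semaphore(max_concurrency)
    # gather schedules all fetches concurrently; results preserve input order.
    return await asyncio.gather(*(fetch(u, sem) for u in urls))

results = asyncio.run(fetch_all(["a", "b", "c"]))
```

The semaphore is the key design choice: `asyncio.gather` alone would start every request at once, while the semaphore caps in-flight requests at a polite, tunable limit.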