Big Data Engineer / Developer

Location : Philadelphia, PA
Job Type : Temp/Contract to Direct
Reference Code : 596
Hours : Full Time
Required Years of Experience : 2+ years
Required Education : BS
Travel : No
Relocation : No

Job Description :

SUMMARY

Analytics Data Engineer needed with experience working with large-scale, distributed data pipelines. The position will help create our next-generation analytics platform, and its responsibilities span the full engineering lifecycle, from architecture and design through data analysis, software development, QA, release, and operations support.

You will work as a member of a dedicated DevOps team tasked with building and operating the analytics platform, collaborating closely with a team of data analysts and scientists.

RESPONSIBILITIES:

· Create and support an analytics infrastructure for high-volume, high-velocity data pipelines

· Analyze massive amounts of data using both real-time and batch processing

· Prototype ideas for new tools, products and services

· Ensure a quality transition to production and solid production operation of analytics and ETL jobs

· Help automate and streamline our operations and processes

· Troubleshoot and resolve issues in our environments as they arise

· Develop and test data integration components to high standards of quality and performance

· Lead code reviews

· Assist with planning and executing releases of data pipeline components into production

· Troubleshoot and resolve critical production failures in the data pipeline

· Research, identify and recommend technical and operational improvements that may improve the reliability, efficiency and maintainability of the analytics pipeline

· Evaluate and advise on technical aspects of work requests in the product backlog


Required Qualifications :

REQUIREMENTS

· 2+ years of development experience in Java/Scala

· Strong experience using ETL or other data transformation tools

· 2 years' experience in the analytics space, working with distributed compute frameworks

· Experience working with various data sources and data sets

· Experience working in an AWS environment

· Adaptable, proactive and willing to take ownership

· Good communication skills; able to analyze and clearly articulate complex issues and technologies

· Bachelor's degree or higher in Computer Science or a related field

· Knowledge of the following technologies a plus: Java, Scala, Spark, Storm, Kafka, Kinesis, Avro

