Current Job Openings
Data Engineer
Job Duties
- Assess and plan migration from Info-Works to GCP. Leverage GCP services like Cloud Storage and Compute Engine.
- Create Python packages for API integration with AD platforms.
- Ensure data consistency and integrity during extraction. Implement extraction and transformation logic tailored to each API. Utilize libraries such as `pandas` for data manipulation.
- Install and configure Apache Airflow for workflow orchestration. Develop Directed Acyclic Graphs (DAGs) for task dependencies (see the sketch after this list).
- Schedule and automate ETL workflows using Airflow’s scheduler. Optimize resource usage on GCP to minimize costs.
- Utilize serverless GCP services for cost-efficient processing. Monitor workflow status and performance using Airflow’s tools.
- Integrate Airflow with logging solutions for centralized log management. Work under supervision. Travel and/or relocation to unanticipated client sites throughout the USA is required.
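For illustration, the Airflow orchestration work described above might resemble the minimal sketch below. The DAG id, task names, sample data, and output path are hypothetical placeholders, not an actual client pipeline.

```python
# Minimal Airflow DAG sketch: a hypothetical extract -> transform dependency,
# scheduled daily. Task names, data, and the output path are illustrative only.
from datetime import datetime

import pandas as pd
from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_ad_data(**context):
    # Placeholder for an API call to one of the AD platforms mentioned above.
    return [{"campaign": "demo", "clicks": 120, "spend": 34.5}]


def transform_with_pandas(**context):
    # Pull the extracted rows from XCom and reshape them with pandas.
    rows = context["ti"].xcom_pull(task_ids="extract_ad_data")
    df = pd.DataFrame(rows)
    df["cpc"] = df["spend"] / df["clicks"]
    df.to_csv("/tmp/ad_metrics.csv", index=False)  # stand-in for a Cloud Storage upload


with DAG(
    dag_id="ad_platform_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",  # Airflow's scheduler automates the run cadence
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_ad_data", python_callable=extract_ad_data)
    transform = PythonOperator(task_id="transform_with_pandas", python_callable=transform_with_pandas)

    extract >> transform  # declare the task dependency
```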
Education
Master’s degree in Computer Science/Information Technology/IS/Engineering (any) or a closely related field.
Salary
$143,458 per annum + benefits
Apply Online
Software Developer
Job Duties
- Interact with technology teams to create necessary requirement documentation such as ETL/ELT mappings, interface specifications, Production Runbooks, Deployment Guides, and Change Requests.
- Participate in collaboration with end users on data and reporting requirements, business objectives, and data analytics needs.
- Monitor production job failures, lead Root Cause Analysis efforts, and work with various teams to resolve the issues.
- Utilize the ServiceNow tool for ticketing of production failures and for Change Requests covering code deployments.
- Support the production environment and debug issues using Databricks logs and Snowflake query profiling. Utilize Jenkins, GitLab, and CodeCloud as the runtime environment of the CI/CD system to build, test, and deploy jobs into production.
- Implement projects using Databricks Delta Lake, Snowflake, AWS S3, STARS, and Python. Design and develop processes to extract data from a Teradata DB onto S3 using PySpark (see the sketch after this list). Work under supervision. Travel and/or relocation to unanticipated client sites throughout the USA is required.
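A minimal sketch of the Teradata-to-S3 extraction pattern mentioned above, assuming a Teradata JDBC driver is available on the cluster. The hostname, table, credentials, and bucket are placeholders.

```python
# Sketch: extract a Teradata table to S3 with PySpark via JDBC.
# Hostname, table name, credentials, and bucket path are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("teradata_to_s3_extract").getOrCreate()

source_df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:teradata://teradata-host.example.com/DATABASE=SALES")
    .option("driver", "com.teradata.jdbc.TeraDriver")
    .option("dbtable", "SALES.TRANSACTIONS")
    .option("user", "etl_user")
    .option("password", "********")
    .load()
)

# Land the extract in S3 as Parquet for downstream Delta Lake / Snowflake loads.
source_df.write.mode("overwrite").parquet(
    "s3://example-landing-bucket/teradata/transactions/"
)
```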
Education
Master’s degree in Computer Science/IT/IS/Engineering (any) or a closely related field.
Salary
$163,966 per annum + benefits
Apply Online
Software Developer
Job Duties
- Conduct meticulous assessment of source system tables, performing comprehensive data profiling and analysis, crucial for ensuring data integrity and accuracy.
- Validate and map source-to-target transformations, essential for the seamless development of ETL logic and maintaining data fidelity.
- Architect and design source and stage tables with attention to detail, incorporating robust audit and job logging mechanisms to safeguard data quality and compliance.
- Facilitate transparent communication and collaboration among team members and stakeholders, ensuring alignment on data analysis, preparation, and validation efforts. Lead rigorous design reviews with architects and peers, essential for optimizing data architecture and ensuring scalability and performance.
- Craft and optimize SQL code for efficient data transformations within the Rhapsody IDE, streamlining ETL processes and enhancing data processing efficiency. Drive the creation of impactful reports and visualizations using industry-leading tools like Telerik Designer, Power BI, and Tableau Server, empowering stakeholders with actionable insights.
- Write complex Python functions in TensorFlow to map the extracted data to the disease diagnoses present in the databases (see the sketch after this list). Work under supervision. Travel and/or relocation to unanticipated client sites throughout the USA is required.
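The TensorFlow duty above is sketched below as a generic multi-class classifier mapping numeric feature vectors to diagnosis labels. The feature count, label count, and data are synthetic assumptions for illustration and do not reflect any actual clinical model or database.

```python
# Illustrative TensorFlow/Keras sketch: map numeric feature vectors to one of
# several diagnosis categories. Shapes and data are synthetic placeholders.
import numpy as np
import tensorflow as tf

NUM_FEATURES = 16   # assumed width of the extracted feature vectors
NUM_DIAGNOSES = 5   # assumed number of diagnosis categories in the database

# Synthetic stand-in for extracted records and their known diagnosis codes.
features = np.random.rand(200, NUM_FEATURES).astype("float32")
labels = np.random.randint(0, NUM_DIAGNOSES, size=200)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(NUM_FEATURES,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(NUM_DIAGNOSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(features, labels, epochs=5, batch_size=32, verbose=0)

# Map a new record to its most likely diagnosis index.
predicted = model.predict(features[:1]).argmax(axis=1)
print("Predicted diagnosis index:", int(predicted[0]))
```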
Education
Bachelor’s degree in Computer Science/IT/IS/Engineering (any) or a closely related field, with 12 months of experience in the job offered or as an IT Consultant, Analyst, Programmer, Developer, or in a related field.
Experience
Experience of 12 months working with Python, Power BI, and Tableau is required. Travel and/or relocation to unanticipated client sites within the USA is required. International travel is not required. The frequency of travel is not currently known, as it depends on client and project requirements that cannot be anticipated at this time. The employer provides information technology services to various clients in the USA, and implementing projects will therefore require such travel.
Salary
$143,458 per annum + benefits
Apply Online
Software Developer
Job Duties
- Participate in different phases of the project and assist the technical team in planning and creating project standards for the migration of Retirement applications.
- Participate in the creation of in-house methodologies for SQL, Informatica, and COBOL jobs. Identify complex logic and implement code in DB2 for validation and troubleshooting. Participate in creating consolidated SQL for Informatica, COBOL, and SAS, and in creating Service Accounts, Distribution Groups, and Access jobs based on the migration strategy.
- Work on IIDR (IBM replication tool) for the migration of data and applications on a day-to-day basis. Work with AWS services such as AWS Glue and S3 buckets. Work with Databricks to perform ad-hoc data analysis. Perform data analysis in Snowflake using SQL to identify missing source identifiers in the LANDING and PREPARED schema tables (see the sketch after this list). Work under supervision. Travel and/or relocation to unanticipated client sites throughout the USA is required.
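A minimal sketch of the kind of Snowflake check described above, using the snowflake-connector-python package. The account, credentials, and the table and column names inside the LANDING and PREPARED schemas are assumptions for illustration.

```python
# Sketch: find source identifiers present in LANDING but missing from PREPARED.
# Account, credentials, schema, table, and column names are illustrative assumptions.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",
    user="etl_user",
    password="********",
    warehouse="ANALYTICS_WH",
    database="MIGRATION_DB",
)

MISSING_IDS_SQL = """
    SELECT l.source_id
    FROM LANDING.customer_stage AS l
    LEFT JOIN PREPARED.customer AS p
        ON l.source_id = p.source_id
    WHERE p.source_id IS NULL
"""

cur = conn.cursor()
try:
    cur.execute(MISSING_IDS_SQL)
    missing = [row[0] for row in cur.fetchall()]
    print(f"{len(missing)} source identifiers missing from PREPARED")
finally:
    cur.close()
    conn.close()
```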
Education
Master’s degree in Computer Science/Information Technology/IS/Engineering (any) or a closely related field.
Salary
$149,781 per annum + benefits
Apply Online