Details
  • Location: New York, NY
  • Type: Permanent
  • Job #24921

Company Overview:
A global software investor with over $90 billion in assets under management is seeking an Enterprise AI Data & Integration Engineer to join its Business Systems & Data Engineering team. The role focuses on designing data pipelines, intelligent automations, and AI-powered analytics that enhance decision-making.

Job Responsibilities:

  • Build and maintain cross-system data integrations using low-code and programmatic frameworks (Workato, Power Automate, Python, SQL).
  • Develop AI-assisted analytics workflows using LLMs and agents to extract, enrich, and summarize data.
  • Design and orchestrate data pipelines blending structured and unstructured data sources.
  • Partner with engineering teams to automate testing, data validation, deployment, and monitoring of analytics workflows.
  • Collaborate with analysts and stakeholders to design automated solutions delivering measurable insights.
  • Prototype and productionize AI-driven data tools, including custom GPTs or Claude Skills.
  • Document, govern, and scale best practices for automation and data reliability.
  • Stay ahead of emerging tech in AI analytics, multi-agent systems, and data engineering.

Qualifications:

  • 7+ years of experience in data engineering, analytics automation, or systems integration.
  • Strong proficiency in SQL, Python, and/or low-code iPaaS tools (Workato, Power Automate).
  • Experience building data pipelines and ETL/ELT workflows, integrating multiple enterprise systems.
  • Solid understanding of APIs, webhooks, and system integration design patterns.
  • Bachelor’s degree in computer science, data engineering, information systems, or equivalent experience.

Preferred:

  • Experience using LLM frameworks (LangChain, CrewAI, Claude Skills, OpenAI API).
  • Familiarity with data warehouses (Snowflake, Redshift, BigQuery) and BI tools (Looker, Power BI, Tableau).
  • Hands-on experience with analytics engineering practices (dbt, Airflow, CI/CD for data pipelines).
  • Understanding of agent architectures and multi-agent orchestration frameworks.
  • Experience in financial services, PE/VC, or other data-rich, high-velocity environments.
  • Exposure to data observability, monitoring, or quality frameworks.

Compensation:
Salary: $150,000–$240,000, plus bonus
 
