Data Integration Developer

Job ID: 110385
Location: Seattle, WA [Flex]
Salary: $70.00 - $75.00 per hour
Category: App/Dev
Employment Type: Contract

Job Description:
Our client is seeking a highly skilled Automation & Data Integration Developer who excels in PowerShell, Python, and RESTful API integration. This role involves building and maintaining automated solutions that pull and process data from major platforms such as Google Workspace, Microsoft 365, and Slack. The ideal candidate will have strong experience in scripting, data collection, and writing complex queries to extract meaningful insights from various systems. This position is critical to automating manual workflows and improving data accessibility across the organization.

Responsibilities:

  • Design, develop, and maintain scripts in PowerShell and Python for automation and data extraction.
  • Integrate with third-party APIs (Google, Microsoft, Slack, etc.) using RESTful calls to collect structured and unstructured data (see the sketch after this list).
  • Write efficient and scalable queries for processing and analyzing collected data.
  • Automate manual workflows by building reliable and reusable code.
  • Develop internal dashboards or reporting tools to surface collected data where appropriate.
  • Troubleshoot API issues, maintain authentication/authorization flows (OAuth, tokens), and ensure data security best practices.
  • Document processes, scripts, and integration logic for internal stakeholders and team members.
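
As a rough illustration of the REST integration work described above, the sketch below pulls channel metadata from the Slack Web API with cursor-based pagination, bearer-token authentication, and basic rate-limit handling. The SLACK_TOKEN environment variable, the page size, and the use of the requests library are illustrative assumptions, not requirements of the role.

    """Minimal sketch: list Slack channels with cursor pagination and
    simple handling of HTTP 429 rate-limit responses."""
    import os
    import time

    import requests

    SLACK_API = "https://slack.com/api/conversations.list"


    def fetch_all_channels(token: str, page_size: int = 200) -> list[dict]:
        """Collect every channel the token can see, following next_cursor pages."""
        headers = {"Authorization": f"Bearer {token}"}
        channels: list[dict] = []
        cursor = None

        while True:
            params = {"limit": page_size}
            if cursor:
                params["cursor"] = cursor

            resp = requests.get(SLACK_API, headers=headers, params=params, timeout=30)

            # Respect rate limiting: back off for the advertised interval, then retry.
            if resp.status_code == 429:
                time.sleep(int(resp.headers.get("Retry-After", "1")))
                continue

            resp.raise_for_status()
            payload = resp.json()
            if not payload.get("ok"):
                raise RuntimeError(f"Slack API error: {payload.get('error')}")

            channels.extend(payload.get("channels", []))

            # An empty next_cursor means the last page has been reached.
            cursor = payload.get("response_metadata", {}).get("next_cursor")
            if not cursor:
                return channels


    if __name__ == "__main__":
        token = os.environ["SLACK_TOKEN"]
        for channel in fetch_all_channels(token):
            print(channel["id"], channel.get("name"))

The same pattern (authenticate, page through results, back off on rate limits, accumulate records) carries over to Google Workspace and Microsoft Graph endpoints, each of which uses its own pagination token.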

Qualifications:
  • 5+ years of experience in scripting with PowerShell and Python.
  • Strong experience working with REST APIs, including handling authentication, rate limits, and pagination.
  • Experience pulling and transforming data from Google Workspace APIs, Microsoft Graph APIs, and Slack APIs.
  • Proficiency in writing SQL or NoSQL queries for data extraction and manipulation (see the sketch at the end of this posting).
  • Experience with logging, monitoring, and handling API errors and timeouts.
  • Familiarity with JSON, XML, and other data interchange formats.
  • Excellent problem-solving and debugging skills.
  • Strong written and verbal communication skills.
  • Experience with data warehousing or data pipeline tools (e.g., Airflow, dbt, Azure Data Factory) is preferred.
  • Exposure to cloud environments (e.g., GCP, Azure, AWS) is a plus.
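
To make the query-writing expectation concrete, here is a small, self-contained sketch that loads API-style JSON records into an in-memory SQLite database and extracts an aggregate with SQL. The record shape, table name, and aggregation are invented for illustration; real payloads from Google Workspace, Microsoft Graph, or Slack will differ.

    """Minimal sketch: load API-style JSON records into SQLite and query them."""
    import json
    import sqlite3

    # Hypothetical records, shaped like a simplified API response.
    RAW = """
    [
      {"user": "alice", "app": "Slack", "messages": 42},
      {"user": "bob",   "app": "Slack", "messages": 17},
      {"user": "alice", "app": "Teams", "messages": 8}
    ]
    """


    def main() -> None:
        records = json.loads(RAW)

        conn = sqlite3.connect(":memory:")
        conn.execute(
            "CREATE TABLE activity (user TEXT, app TEXT, messages INTEGER)"
        )
        conn.executemany(
            "INSERT INTO activity VALUES (:user, :app, :messages)", records
        )

        # Aggregate message counts per user across all platforms.
        query = """
            SELECT user, SUM(messages) AS total_messages
            FROM activity
            GROUP BY user
            ORDER BY total_messages DESC
        """
        for user, total in conn.execute(query):
            print(f"{user}: {total}")

        conn.close()


    if __name__ == "__main__":
        main()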