Data/Research Analyst (Python/SQL/Sales)

Company: Techtriad Team - T3
Location: Oakland, California, United States
Type: Full-time
Posted: 30.MAR.2021




Title: Sales Research Analyst

Location: Oakland, CA (initially remote, at least through August; the candidate will be expected to relocate once the COVID situation calms down. Candidates based in the Bay Area, Chicago, New York, or DC will also be considered.)

Duration: 6-month contract

Position Summary:

The Sales Research Analyst contractor will support the Media Client advertising sales team by building, troubleshooting, and updating our automated data feed pipeline for external vendors. This is a highly technical role requiring a unique and wide range of skills. The candidate should be detail-oriented, highly adept at problem solving, and an effective communicator. Training will be provided.

The data feed pipeline supports the Ad Effectiveness team in delivering log files to more than three dozen third-party vendors. These files are used to conduct ad measurement studies (brand lift, sales lift, location analysis, etc.). Advertiser demand to measure campaign effectiveness on Media Client has grown rapidly, resulting in more than 3,000 studies in 2019. As scale and demand continue to grow, the Analyst will play an integral part in the development and maintenance of our automated data feed pipeline.

Duties and Responsibilities:

  • Writing SQL queries for third-party vendor exposure files: Write or rewrite custom queries specific to the dozen or so vendors we work with. This requires following each vendor's spec and ensuring that the query output matches it precisely. It often involves going back and forth with the vendor because their requests aren't always spelled out clearly, and it calls for very advanced SQL skills, as queries sometimes need to be written from scratch.

  • Creating/modifying the automated data feed process for different jobs: The setup and structure of our ad products (direct sold, programmatic, SoundCloud, etc.) are constantly changing and evolving. We are always playing catch-up to ensure our exposure files incorporate the latest changes. The contractor will be asked to account for these changes proactively.

  • Troubleshooting exposure files: Invariably, something will break, and there can be any number of reasons why, particularly at the scale we operate and with the number of vendors we have. Issues are raised and addressed in a Slack channel; the contractor will need to identify where an issue originated and fix it. This can often feel like finding a needle in a haystack, so high proficiency in SQL and Python is essential.

  • Optimizing data feed process files: With our migration to Google Cloud Platform, there is increased scrutiny of the processing costs each department generates. As the business evolves, we will need to monitor costs and ensure the entire operation is optimized as much as possible.
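To illustrate the first duty above, here is a minimal sketch of checking an exposure-file extract against a vendor's spec before delivery. The column names and rules are purely hypothetical stand-ins, not Media Client's actual schema or any real vendor's spec.

```python
# Hypothetical vendor spec: required columns and a numeric campaign_id.
# These names are illustrative only.
REQUIRED_COLUMNS = ["impression_id", "timestamp", "campaign_id", "device_type"]

def validate_exposure_rows(rows):
    """Check each row (a dict) against the vendor spec; return a list of errors."""
    errors = []
    for i, row in enumerate(rows):
        missing = [c for c in REQUIRED_COLUMNS if c not in row]
        if missing:
            errors.append(f"row {i}: missing columns {missing}")
        elif not str(row["campaign_id"]).isdigit():
            errors.append(f"row {i}: campaign_id must be numeric")
    return errors
```

In practice the spec would live in a per-vendor config so the same validator covers every vendor, which is one way the back-and-forth over unclear requests gets captured as code.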
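The troubleshooting duty can be sketched as a simple reconciliation step: compare what each vendor should have received with what actually landed, and surface the gaps (the kind of report one might post in the Slack channel). Vendor names, dates, and row counts here are placeholders.

```python
def find_feed_gaps(expected, delivered):
    """Compare expected deliveries against actual ones.

    expected:  {(vendor, date): expected_row_count}
    delivered: {(vendor, date): actual_row_count}
    Returns a list of human-readable issue strings.
    """
    issues = []
    for key, want in sorted(expected.items()):
        got = delivered.get(key)
        if got is None:
            issues.append(f"{key[0]} {key[1]}: file never delivered")
        elif got != want:
            issues.append(f"{key[0]} {key[1]}: row count {got} != expected {want}")
    return issues
```

Narrowing "something broke" down to a specific vendor/date pair is usually the first step before digging into the SQL or Python that produced that file.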
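For the cost-optimization duty, a rough sketch of the kind of monitoring described: estimate on-demand query cost from bytes scanned and flag queries over a budget. The $5/TiB rate is an assumption based on GCP's historical BigQuery on-demand pricing and may be outdated; the budget threshold and query names are hypothetical.

```python
# Assumption: historical BigQuery on-demand rate; verify current pricing.
COST_PER_TIB_USD = 5.0

def estimated_query_cost(bytes_scanned):
    """Estimated USD cost for a query scanning `bytes_scanned` bytes."""
    tib = bytes_scanned / (1024 ** 4)
    return round(tib * COST_PER_TIB_USD, 4)

def flag_expensive_queries(query_bytes, budget_usd=1.0):
    """query_bytes: {query_name: bytes_scanned}; return names over budget."""
    return [q for q, b in sorted(query_bytes.items())
            if estimated_query_cost(b) > budget_usd]
```

In a real pipeline the bytes-scanned figures would come from the warehouse's own job statistics rather than being passed in by hand; this only shows the monitoring logic.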

Minimum Requirements:

  • 1-2 years' experience in an analytical role in a digital, tech and/or media environment
  • Advanced SQL skills
  • Intermediate Python

Technical Skills:

  • Advanced experience with SQL frameworks such as Hive/Hadoop, Presto, and BigQuery.
  • Python and the Airflow framework.
  • Experience with cloud databases is a plus.

- provided by Dice
