Forbes Advisor

Data Research Engineer - Data Extraction (India - Remote)

Posted 13 Days Ago
Remote
Hiring Remotely in Mumbai, Maharashtra
Entry level
As a Data Research Engineer, you will develop methods for data quality assurance, implement validation rules, and acquire data using web crawling and API integration. You'll develop Python scripts for ETL processes, maintain data processing workflows, and ensure data quality. Collaboration with cross-functional teams and staying updated with emerging technologies are also key responsibilities.
The summary above was generated by AI

Company Description

Forbes Advisor is looking for a Data Research Engineer - Data Extraction to join the Forbes Marketplace Performance Marketing team, with a focus on supporting one of Forbes' business verticals. If you're looking for the challenges and opportunities of a start-up, with the benefits of an established, successful company, read on.

We are an experienced team of industry experts dedicated to helping readers make smart decisions and choose the right products with ease. Marketplace boasts decades of experience across dozens of geographies and teams, including Content, SEO, Business Intelligence, Finance, HR, Marketing, Production, Technology and Sales. The team brings rich industry knowledge to Marketplace’s global coverage of consumer credit, debt, health, home improvement, banking, investing, credit cards, small business, education, insurance, loans, real estate and travel.

The Data Extraction Team is a brand-new team that plays a crucial role in our organization by designing, implementing, and overseeing advanced web scraping frameworks. Its core function is creating and refining tools and methodologies to efficiently gather precise, meaningful data from a diverse range of digital platforms. The team is also tasked with constructing robust data pipelines and implementing Extract, Transform, Load (ETL) processes, which are essential for seamlessly transferring the harvested data into our data storage systems and ensuring it is readily available for analysis and use.

A typical day in the life of a Data Research Engineer will involve acquiring and integrating data from various sources, developing and maintaining data processing workflows, and ensuring data quality and reliability. They collaborate with the team to identify effective data acquisition strategies and develop Python scripts for data extraction, transformation, and loading processes. They also contribute to data validation, cleansing, and quality checks. The Data Research Engineer stays updated with emerging data engineering technologies and best practices.


Job Description

Responsibilities

  • Develop methods and processes for data quality assurance (QA) to ensure accuracy, completeness, and integrity.
  • Define and implement data validation rules and automated data quality checks.
  • Perform data profiling and analysis to identify anomalies, outliers, and inconsistencies.
  • Assist in acquiring and integrating data from various sources, including web crawling and API integration.
  • Develop and maintain scripts in Python for data extraction, transformation, and loading (ETL) processes.
  • Stay updated with emerging technologies and industry trends.
  • Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines.
  • Contribute to cross-functional teams in understanding data requirements.
  • Assume accountability for achieving development milestones.
  • Prioritize tasks to ensure timely delivery, in a fast-paced environment with rapidly changing priorities.
  • Collaborate with and assist fellow members of the Data Research Engineering Team as required.
  • Leverage online resources such as Stack Overflow, ChatGPT, and Bard effectively, while remaining mindful of their capabilities and limitations.
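The responsibilities above center on a repeatable extract-validate-load loop. As a purely illustrative sketch (hypothetical data and table names; it uses only the Python standard library, whereas the team's real pipelines would more likely use BeautifulSoup or Selenium for extraction and Pandas or SQLAlchemy for transform and load), such a pipeline might look like:

```python
# Hypothetical minimal ETL sketch: extract rows from HTML, run a QA check,
# and load the valid rows into SQLite. Stdlib only, for illustration.
import re
import sqlite3

HTML = """
<table>
  <tr><td>Acme Loans</td><td>5.2</td></tr>
  <tr><td>Best Bank</td><td>4.8</td></tr>
  <tr><td>Broken Row</td><td>n/a</td></tr>
</table>
"""

def extract(html):
    """Pull (name, rate) string pairs out of the table markup."""
    return re.findall(r"<tr><td>(.*?)</td><td>(.*?)</td></tr>", html)

def validate(rows):
    """Basic data QA: reject rows whose rate is not a parseable number."""
    clean, rejected = [], []
    for name, rate in rows:
        try:
            clean.append((name, float(rate)))
        except ValueError:
            rejected.append((name, rate))
    return clean, rejected

def load(rows, conn):
    """Load validated rows into the storage layer (here, an in-memory DB)."""
    conn.execute("CREATE TABLE IF NOT EXISTS rates (name TEXT, rate REAL)")
    conn.executemany("INSERT INTO rates VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
clean, rejected = validate(extract(HTML))
load(clean, conn)
print(len(clean), len(rejected))  # prints "2 1": two rows loaded, one rejected
```

The validation step mirrors the QA responsibilities listed above: malformed records are quarantined rather than silently loaded, so completeness and integrity can be reported on.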

Skills and Experience

  • Bachelor's degree in Computer Science, Data Science, or a related field. 
  • Strong proficiency in Python programming for data extraction, transformation, and loading.
  • Proficiency in SQL and data querying is a plus.
  • Knowledge of Python modules such as Pandas, SQLAlchemy, gspread, PyDrive, BeautifulSoup, Selenium, scikit-learn, and Plotly.
  • Knowledge of web crawling techniques and API integration.
  • Knowledge of data quality assurance methodologies and techniques.
  • Familiarity with machine learning concepts and techniques.
  • Familiarity with HTML, CSS, JavaScript.
  • Familiarity with Agile development methodologies is a plus.
  • Strong problem-solving and analytical skills with attention to detail.
  • Creative and critical thinking.
  • Ability to work collaboratively in a team environment.
  • Strong, effective communication skills.
  • Experience with version control systems, such as Git, for collaborative development.
  • Ability to thrive in a fast-paced environment with rapidly changing priorities.
  • Comfortable with autonomy and ability to work independently.

Perks

  • Day off on the 3rd Friday of every month (one long weekend each month)
  • Monthly Wellness Reimbursement Program to promote health and well-being
  • Monthly Office Commutation Reimbursement Program
  • Paid paternity and maternity leaves
  • Group Medical Insurance
  • Group Term Life Insurance (2.5x of CTC)
  • Group Personal Accident Insurance (3x of CTC)

Qualifications

Bachelor's degree in Computer Science, Data Science, or a related field.

Top Skills

Python
SQL
