Data Engineer

The Catch Company


Posted: 08/21/2019 10:21:23

Job type: Full-time

Hiring from: US only

Category: All others


The Catch Company is looking for a Data Engineer to support our Analytics team. The team is growing and improving our analytics tech stack to enable smarter and faster business decisions, automated processes, and personalized customer experiences.

In this role, you will own the development and maintenance of our analytics tech stack and will be instrumental in identifying and implementing new technologies and tools to support our goals. While we already have a robust analytics/business intelligence ecosystem in place, we believe the right Data Engineer can push our team to become an industry leader in enabling smarter decision-making and personalizing our customer experience through new technologies and approaches. Our current tech stack includes Redshift (warehouse), Fivetran/Stitch Data/custom pipelines (ETL), dbt (transformation), Looker (visualization), and a variety of other services that support one-off tools (e.g., Jupyter notebooks and Amazon EC2).
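
To make the shape of that stack concrete, here is a minimal, hypothetical sketch of the "one-off tools" side: pulling a modeled table out of Redshift into a Jupyter notebook with pandas and SQLAlchemy. The connection string, schema, and table name are illustrative placeholders, not details from this posting.

```python
# Hypothetical sketch: query a modeled Redshift table from a Jupyter notebook.
# The host, credentials, and table names below are placeholders.
import pandas as pd
from sqlalchemy import create_engine

# Redshift speaks the Postgres wire protocol, so the standard psycopg2 driver works.
engine = create_engine(
    "postgresql+psycopg2://analytics_user:secret@example-cluster.example.com:5439/analytics"
)

# Pull a small aggregate for ad-hoc analysis; heavier transformation stays in dbt/Redshift.
orders_by_day = pd.read_sql(
    """
    SELECT order_date,
           COUNT(*)     AS orders,
           SUM(revenue) AS revenue
    FROM analytics.fct_orders
    GROUP BY order_date
    ORDER BY order_date
    """,
    engine,
)

print(orders_by_day.head())
```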

Additionally, we welcome both local (Chicago) and remote candidates for this role! Our analytics team is partially remote and our engineering team is fully remote. Travel is not required for interviews or the job itself.

What makes this a special opportunity:

  • As the only Data Engineer on the team, you will have broad freedom to change and improve the way we do things

  • You will have the opportunity to be a thought leader in selecting new technologies, and you will be responsible for identifying and implementing them

  • You will work with people who are eager to use data to improve our product offerings, our customer experience, and other key components of the business

  • We place a premium on building a great culture made up of great people

  • You will work with and learn from experienced leaders who have a track record of building successful companies

What you will do:

  • Own the maintenance and development of our analytics tech stack, including identifying and implementing new tools, managing utilization, and improving performance

  • Model and architect our data in a way that will scale with the increasingly complex ways we’re analyzing it

  • Restructure our processes for ingesting and analyzing website event data to (a) capture more usable, relevant data and (b) use technologies like Spark that allow for faster data transformation (a rough PySpark sketch follows this list)

  • Build custom data pipelines that reliably provide clean, ready-to-analyze data and develop systems that monitor those pipelines to ensure their health

  • Work closely with our software engineers to identify new opportunities for data collection (with a focus on personalization/recommendation systems) and build the processes to make that data available in our data warehouse

  • Identify use cases for real-time/streaming analytics, then select and implement tools to support them

  • Research and surface new ideas and approaches, whether new technologies, tools, frameworks, or process improvements for the team
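
As a hedged illustration of the event-data bullet above, the sketch below uses PySpark to roll raw website event logs up into a daily impression/view/click summary. The S3 paths and field names (event_type, product_id, occurred_at) are assumptions made for the example, not details from this posting.

```python
# Hypothetical sketch: aggregate raw website events into a daily rollup with PySpark.
# Paths and field names are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily_event_rollup").getOrCreate()

# Raw events are assumed to be JSON lines with event_type, product_id, and occurred_at.
events = spark.read.json("s3://example-bucket/raw/web_events/")

daily = (
    events
    .filter(F.col("event_type").isin("impression", "view", "click"))
    .withColumn("event_date", F.to_date("occurred_at"))
    .groupBy("event_date", "product_id")
    .pivot("event_type", ["impression", "view", "click"])
    .count()
)

# Land the rollup where the warehouse can pick it up (e.g., via a Redshift COPY).
daily.write.mode("overwrite").parquet("s3://example-bucket/analytics/daily_event_rollup/")

spark.stop()
```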

REQUIREMENTS

What experience you need:

  • Experience working in data engineering, data architecture, or a similar field

  • Extensive experience manipulating data using SQL

  • Experience using Git to version/manage code

  • Fluency in one or more programming languages such as Python, Java, or Go

  • Experience working with relational databases/data warehouses

  • Familiarity with ETL tools

  • Familiarity with business intelligence/visualization tools

  • [Optional/Preferred]: Experience building custom data pipelines

  • [Optional/Preferred]: Experience structuring and analyzing high volumes of website event data (e.g., impressions, views, clicks, etc.)

  • You must be eligible to work in the United States; visa sponsorship is not available

What will make you successful:

  • Curiosity: Always seeking to understand "why" and always looking to make things better

  • Passion: You are driven by a love for what you do

  • Optimism: The ability to bounce back quickly when something doesn’t work

  • Action: Knowing when to shift from planning to doing

  • Honesty: Transparency with customers, partners and teammates

  • Entrepreneurial spirit

  • Data-driven mindset

  • An interest in / passion for the outdoors (fishing knowledge not required!)

BENEFITS

  • "Take what you need" PTO Policy

  • 4 additional paid days off specifically to enjoy the outdoors

  • Flexible working schedule

  • Ability to work from home if there is a need

  • Medical, Dental and Vision Insurance - We cover 85% of your premium and 50% for dependents

  • Health Savings Account

  • 401(k) plan

  • Pre-Tax Commuter Benefits

  • Unlimited fruit snacks

Unfortunately, visa sponsorship is not available at this time

Please mention that you come from Remotive when applying for this job.
