Data Scientist

Toptal


07/31/2019 10:21:23

Job type: Full-time

Category: All others


Toptal is a global network of top talent in business, design, and technology that enables companies to scale their teams on demand. With $100+ million in annual revenue and triple-digit growth, Toptal is the largest fully distributed workforce in the world.

We take the best elements of virtual teams and combine them with a support structure that encourages innovation, social interaction, and fun (see this video from The Huffington Post). We see no borders, move at a fast pace, and are never afraid to break the mold.

Position Description

At Toptal, we measure everything and always rely on data to guide all of our initiatives, including both our long-term strategy and our day-to-day operations. As a Data Scientist on our Analytics Team, you will be working with our cross-functional team to model complex problems, discover actionable business insights, and identify high-impact opportunities. You will be part of a high-energy, fast-paced team responsible for supporting initiatives and operations across the company.

This is a remote position that can be done from anywhere. All communication and resumes must be submitted in English.

We have freedom to choose the best tool for the job. That’s why:

  • Our infrastructure is in Google Cloud Platform,

  • For research we leverage both Python and R,

  • Our ETL pipelines and production models are in Python and Scala.

The ultimate goal is solving real business problems. We work on problems that deeply affect the operation of our business, ranging from predictive models to controlling and optimizing client flow to very open-ended questions. For example, our stakeholders may ask whether it is feasible to pursue a certain business direction and, if so, what the implications are for business processes and the costs of moving to that solution.

Responsibilities:

  • Use statistical, algorithmic, data mining, and visualization techniques to model complex problems, identify opportunities, discover solutions, and deliver actionable business insights.

  • Own your projects and use this autonomy to find creative and innovative ways of solving problems and delivering solutions.

  • Handle both sides of the research and development process, including clean, rigorous implementations of the models you devise inside our Analytics system.

  • Be persistent, focused, and disciplined when it comes to finishing your work. At Toptal we always drive research from start to finish - we don’t get distracted; we don’t leave anything unfinished.

  • Communicate data-driven insights and recommendations to key stakeholders.

  • Be in constant communication with team members and other relevant parties and convey results efficiently and clearly via Slack.

Requirements:

  • A strong background in advanced mathematics, in particular in probability theory and statistics, data mining, and machine learning.

  • The ability to think critically, look at the big picture, and spot what is missing, using that perspective to propose improvements and deliver business insights.

  • 4+ years of professional experience in data science, doing exploratory data analysis, testing hypotheses, and building predictive models.

  • Ability to quickly and accurately understand complex new concepts.

  • Proficiency in a programming language of your own choice (R, Python, Matlab, etc.), and previous experience efficiently conducting research and creating ad hoc reports.

  • An excellent ability to learn new programming languages quickly and effectively.

  • Big pluses include: strong experience managing or shipping a product, managing a team, and working on open-source projects.

  • Working experience with Airflow and Dimensional Modeling is a plus.

  • Be excited about collaborating daily with your team and other groups while working via a distributed model.

  • Be eager to help your teammates, share your knowledge with them, and learn from them.

  • Be open to receiving constructive feedback.

  • You must be a world-class individual contributor to thrive at Toptal. You will not be here just to tell other people what to do.

Please mention that you come from Remotive when applying for this job.

Similar jobs

  • Kalepa is looking for Data Scientists to lead efforts at the intersection of machine learning and big data engineering in order to solve some of the biggest problems in commercial insurance.

    Data scientists at Kalepa will be turning vast amounts of structured and unstructured data from many sources (web data, geolocation, satellite imaging, etc.) into novel insights about behavior and risk. You will be working closely with a small team in designing, building, and deploying machine learning models to tackle our customers’ questions.

    Kalepa is a New York based, VC backed, startup building software to transform and disrupt commercial insurance. Nearly one trillion ($1T) dollars are spent globally each year on commercial insurance across small, medium, and large enterprises. However, the process for estimating the risk associated with a given business across various perils (e.g. fire, injury, malpractice) is still reliant on inefficient and inaccurate manual forms or outdated and sparse databases. This information asymmetry leads to a broken set of economic incentives and a poor experience for both businesses and insurers alike. By combining cutting edge data science, enterprise software, and insurance expertise, Kalepa is delivering precision underwriting at scale – empowering every commercial insurance underwriter to be as effective and efficient as possible. Kalepa is turning real-world data into a complete understanding of risk.

    Kalepa is led by a strong team with experiences from Facebook, APT (acquired by Mastercard for $600M in 2015), the Israel Defense Forces, MIT, Berkeley, and UPenn.

    About you:

    ● You want to design a flexible analytics, data science, and AI framework to transform the insurance industry

    ● You have demonstrated success in delivering analytical projects, including structuring and conducting analyses to generate business insights and recommendations

    ● You have in-depth understanding of applied machine learning algorithms and statistics

    ● You are experienced in Python and its major data science libraries, and have deployed models and algorithms in production

    ● You have a good understanding of SQL and non-SQL databases

    ● You value open, frank, and respectful communication

    ● You are a proactive and collaborative problem solver with a “can do” attitude

    ● You have a sincere interest in working at a startup and scaling with the company as we grow

    As a plus:

    • You have experience in NLP and/or computer vision

    • You have familiarity with Spark, Hadoop, or Scala

    • You have experience working with AWS tools

    What you’ll get

    ● Work with an ambitious, smart, and fun team to transform a $1T global industry

    ● Ground floor opportunity – opportunity to build the foundations for the product, team, and culture alongside the founding team

    ● Wide-ranging intellectual challenges working with large and diverse data sets, as well as with a modern technology stack

    ● Competitive compensation package with a significant equity component

    ● Full benefits package, including excellent medical, dental, and vision insurance

    ● Unlimited vacation and flexible remote work policies

    ● Continuing education credits and a healthy living / gym monthly stipend

    [IMPORTANT NOTE]: Salary ranges are for New York based employees. Compensation for remote roles will be adjusted according to the cost of living and market in the specific geography.

  • Description
    Students enroll in Thinkful courses to gain the valuable technical and professional skills needed to take them from curious learners to employed software developers. As an Immersive Course Instructor, you will deliver high-quality live workshop content based on existing curriculum, preparing students to successfully transition careers. As an instructor for the specializations part of the program, you will deliver high-quality live workshops and help students on topics such as Time Series Analysis, NLP, Deep Learning (using TensorFlow and Keras), or Big Data (using Spark, AWS, and Hadoop).

    In addition to working directly with students, Instructors are expected to maintain an environment of regular, candid feedback with the Educator Experience team, and to stay on top of important updates via meetings, email, and Slack. Ideal candidates for this team are highly coachable, display genuine student advocacy, and are comfortable working in a complex, rapidly changing environment. 

    Responsibilities:

    • Delivers high quality workshops based on the curriculum materials, and provides live coding demos when appropriate, to supplement written materials and content to provide students with the skills and knowledge to get their first developer job 

    • Maintains and updates the daily and weekly student syllabus which outlines the required homework and assignments, and deadlines for assessments and projects 

    • Provides up to 2 hours daily of on-demand video and chat support for students as they move through the program assignments 

    • Spends up to 4 hours a day prepping for workshops and updating course materials

    • Works with the other Format Leads for engagement formats (Mentor Sessions, Group Sessions, Grading, Technical Coaching, Mock Interviews/Assessments) to ensure a consistent experience for students in immersive courses

    • Provides constructive feedback to the Instructional Design team on improvements to the course materials and curriculum based on student experience with the materials

    Requirements:

    • Strong expertise with Time Series Analysis, NLP, Deep Learning (using TensorFlow and Keras), or Big Data (using Spark, AWS, and Hadoop).

    • Strong expertise with Python, SQL, statistics, supervised and unsupervised learning

    • Expertise with Data Structures and Algorithms, and comfort explaining these topics 

    • Ability to run capstone projects and help students during these weeks from idea to implementation

    • Ability to explain complicated topics clearly and without jargon

    • Strong written and verbal communication skills

    • High level of detail orientation and an exceptional work ethic

    • Enjoy working with people, not just putting your head down and working

    • Must have a reliable, high-speed Internet connection

    • Minimum 3-4 years of professional data science experience

    • Teaching experience, especially in a remote or online class, is a plus

  • Bungee empowers enterprises to drive great business decisions.

    Headquartered in Seattle and founded by Amazon veterans, Bungee enables enterprises to gain access to global data on-demand and drive critical business decisions across industries. We’re a small, fast-growing company with our service already in use by several Fortune 50 companies.

    The Data Scientist will shape Bungee’s product offerings and make our existing core data collection and analytics products more efficient and scalable. You will be a key member of the team, owning projects end to end and working directly with customers. You will have the opportunity to solve problems, think creatively, and try new things. You will be part of the team responsible for building the machine learning models that drive our platforms, products, marketing, and business analytics.

    We are looking for people who are willing to propose new and better ways to achieve our goals and able to show us why; who sprint to action to solve a problem, even when conditions are not ideal; who can treat systems and models as a whole to decide what to improve; who are always interested in learning and figuring out ways to make our products and teams better; who deliver early and often, without getting bogged down trying to get it perfect the first time; who want to work cooperatively on a team that is welcoming and truly understands the value of data; and who appreciate working in a phenomenal environment with high ethical standards and a culture of kindness.

    At Bungee Tech, we believe in a workplace where you can be your best, where you and the company can grow together. We attract a highly diverse group of talented people who are both thinkers and doers. We believe we are only scratching the surface on our opportunity, and we’re looking for incredible people like you to help us on that journey. Come join us!

    Responsibilities

    • Architect, develop, and train deep learning models for knowledge extraction, entity matching, entity resolution, reinforcement learning, and knowledge base extraction.

    • Architect, debug, and improve deep learning models.

    • Improve the accuracy of existing machine learning systems

    • Participate in team discussions and presentations.

    Requirements

    • A Master’s or PhD in Computer Science or equivalent

    • Experience in implementing deep learning methods and algorithms, in computer vision and/or NLP

    • Experience optimizing machine learning benchmarks to competitive levels of accuracy

    • 2+ years of industry experience

    • Experience with TensorFlow/Pytorch/Keras, AWS/GCP/Azure, Python

    • Experience in writing algorithms for speed and scalability

    Preferred Skills

    • Experience with learning in distributed systems, containerization, etc.

    • Published research in NLP & CV areas
