Data Engineer


10/04/2019 10:21:23

Job type: Full-time

Hiring from: US & Europe

Salary: $80k – $100k

Category: All others

Legalist is breaking new ground in FinTech and LegalTech. Data Science is one of the pillars of Legalist's continuous innovation, and we're looking for someone who can lead the charge on that front.

You will get to:

  • Use Python, PyCharm, and Jupyter to build our products

  • Use AWS, GCP, Kubernetes, Docker, and Jenkins to scale our infrastructure

  • Learn at the bleeding edge of web and machine learning technologies

  • Work with a ton of legal data to build analytical tools that support the business team

Legalist is an investment firm that uses tech to invest in lawsuits. We graduated from Y Combinator as part of its Summer 2016 batch, and have since garnered international press for our pioneering work. You can read about us in the NYT, WSJ, The Guardian, Le Monde, The Economist, and many others.

If you're interested in the intersection of finance, technology, and law, then you'll find the problems we work on highly interesting. We scrape millions of court records and build technology that streamlines the process of investing in legal assets, while running investment funds that generate high returns for our investors.


Ideally, you will be interested in learning, be proactive, and enjoy using bleeding-edge technologies. Formal experience is not necessary, but a demonstration of capability is required. This is a well-paid role, with salary based on capability. You will work alongside the CTO and co-founder and four talented engineers.

Currently, our platform is built on a backend microservices architecture serving different use cases.

We're looking for people who would love to join a fast-growing startup with great financial projections, paying clients, strong investors, and an awesome team where you can always be learning. Our work is multidisciplinary, and we're looking for engineers with an interest in business as well.


  • Autonomy over a core product and data pipeline

  • Collaborate with engineers to develop and ship features

  • Write efficient, modular, and reusable libraries and abstractions

  • Identify key drivers & insights to improve our analytics engines

  • Participate in code reviews


Applicants are not expected to show an advanced understanding of all of the below, but must show willingness, ability, and interest in keeping up with cutting-edge technologies and frameworks.

  • 4+ years of experience with machine learning and data science techniques

  • Degree in Computer Science, Statistics, Mathematics, or an equivalent field

  • Ability to implement best practices

  • Ability to identify key insights and technologies

  • Comfort independently building MVPs that the engineering team can then extend and support

  • Experience working with modern data stores such as PostgreSQL, S3, Cassandra, or similar NoSQL systems

  • Experience working with Cloud Computing technologies (e.g. AWS, Azure, GCP)

  • Ability to communicate technical specifications, both verbally and in writing

Please mention that you come from Remotive when applying for this job.


Similar jobs

  • Snowplow Analytics (US only)

    Ideally located in Eastern or Central Time

    It’s a hugely exciting time here at Snowplow. Over the last 7 years, we’ve grown to a brilliant 50-person team spread out over 14 countries, with nearly 150 customers and many, many more open source users, and we’re heading for even bigger and better things, fast.

    At Snowplow, we want to empower people to do transformative things using data. We work with companies from around the world to help them better understand their customers and products, and develop a truly data-driven culture. 

    With an ever-expanding US customer base, we are now looking for a US-based Data Consultant to join our Data Strategy team. You’ll work remotely with the opportunity to travel onsite to our customers, as well as to Europe to meet with the rest of the team & company.

    We would love to hear from you if you are excited about working at the interface of technology and people, translating business requirements into technical specifications and then implementing these. 

    The Opportunity

    As a Data Consultant, you’ll be working with companies of all sizes from across different industries on building out their data capability. Through workshops and co-development sessions you’ll understand their use case and business model, design a Snowplow instrumentation that fits it and then work with their data team to implement it.

    The role combines a mixture of tactical tasks, such as writing data models and implementing tracking, with more strategic responsibilities, such as helping companies understand what data they need and how they should consume it. As you gain experience and progress within the role, your focus as part of the Data Strategy team broadens and you’ll provide value to our customers throughout their entire Snowplow journey.

    The environment you’ll be working in

    Our company values are Transparency, Honesty, Ownership, Inclusivity, Empowerment, Customer-centricity, Growth and Technical Excellence. These aren’t just words we plucked out of thin air; we came up with them together as a company, and we are continually looking for new ways to weave them into our day-to-day operations. From flexible hours and working locations to the way we give feedback, we’re passionate about building a company that supports both company and individual development.

    What you’ll be doing

    Helping companies with their data strategy. You’ll help companies architect their cloud data stack and make the organisational changes needed to become a data-driven business. You’ll also work with them to discover new data use cases, and plan out how to action them.

    Mapping business requirements to data solutions. You’ll work with companies on determining what data they need to collect, and how it will be used. Through holding workshops (both onsite and remote), you’ll gain a deep understanding of our customers’ industries and business models, and will be able to best advise them on how to utilise our technology.

    Delivering excellent technical implementations. From designing tracking to developing data models, you’ll work with our customers to ensure they are getting the most value out of our technology. 

    Becoming a Snowplow expert. Through working with the Snowplow team and our customers, as well as attending conferences and events, you’ll become a true Snowplow expert, and develop a thorough understanding of the wider data landscape.

    What you bring to the team

    • Analysis is your thing. Ideally, you will possess a degree in an analytical field such as Maths, Economics, Engineering, Computer Science or similar, or have relevant commercial experience.

    • You’re not right at the start of your journey. You have experience within consulting, data analysis, data engineering, or similar. 

    • Learning is COOL. You love working with data and are excited to learn new tools and technologies.

    • Time IS your friend. You are highly organised and able to manage your time effectively. Working on implementations for multiple customers at any given time, you are able to coordinate customer interaction and technical delivery. 

    • You love to explain ideas. You are a clear and confident communicator and are able to explain complex technical principles to people with various technical abilities. 

    • There is no ‘i’ in team. You have some experience in managing complex projects with multiple stakeholders and dependencies on various other teams.

    • You have strong technical skills. You are proficient at performing data analysis in spreadsheets or relational databases. Prior experience with SQL is valuable but not necessary.

    • Process, process, process. You have a mature attitude to security, documentation and process. Our clients trust us with their data. This is a huge responsibility and informs everything we do.

    • You’re a bit of a jet setter. You are willing and able to travel internationally (including to Europe) for work (up to 25% of the time). 

    What you’ll get in return

    • A competitive package including share options

    • 23 days of holiday a year (plus bank holidays)

    • 401(k) and health insurance 

    • MacBook or Dell XPS 13/15

    • Two fantastic company Away Weeks in a different European city each year (the next one will be in November 2019!)

    • Work alongside a supportive and talented team with the opportunity to work on cutting edge technology and challenging problems

    • Grow and develop in a fast-moving, collaborative organisation

    Snowplow is dedicated to building and supporting a brilliant, diverse and hugely inclusive team. We don't discriminate against gender, race, religion or belief, disability, age, marital status or sexual orientation. Whatever your background may be, we welcome anyone with talent, drive and emotional intelligence.

  • Brave Software (US or Canada)

    We are looking for an experienced data engineer to be responsible for the design, development and maintenance of large data collection and processing systems at Brave. This position will focus on scaling our backend systems to handle the increased load of data collection, processing and presentation.


    • The position primarily works with the statistics and engineering teams, with day-to-day communication with the product and marketing teams.

    • Design and develop AWS systems to collect and manage business events at scale.

    • Respond to requests from the product, engineering, and marketing teams for timely data analysis.

    • Participate in multi-team planning for future data collection needs.


    • Strong AWS skills with extensive knowledge of the data collection, storage and processing systems that AWS provides. Experience building ETL pipelines at scale.

    • Familiarity with Redshift, Kinesis, Athena, Glue, Data Pipeline, MongoDB, PostgreSQL.

    • Strong JavaScript and SQL skills.

    • Extensive knowledge of modern open source software development practices and tools, e.g. Git, GitHub, unit and integration testing, pull requests, code review, etc.

    Additional qualities:

    • Strives to deliver clean, maintainable and testable code

    • Has a passion for user privacy

    • Is open to learning new languages and frameworks

    • Has a soft spot for JavaScript


    • Competitive salary

    • 4 weeks (20 days) of paid vacation per year

    • Excellent medical coverage

    • Generous 401k plan

    • Stock option grant

    • Travel and conference budgets

    • Commuter benefits (on-site only)

    • Hip office in the SoMa neighborhood of SF

    Candidates must be legally authorized to work in the United States or Canada.

  • Ukufu (Sydney or remote)

    • Lead the build out of our AI layer

    • Flexible/remote work friendly environment

    • Greenfield project

    • Technology first company

    • Join an experienced tech & product team

    Ukufu (pronounced oo-koo-foo) is a new AI-powered content aggregation mobile application for professionals. Ukufu version 1.0 is LIVE on both Android and iOS!

    We recently closed a round of seed funding from a couple of smart and supportive investors, and we are excited for the next stage of our journey!


    Our mission is bold:

    Build an intelligence layer around the 10,000 English news-related content pieces that get published every day. Then use this layer to power an easy-to-use, category-based content aggregation app that helps professionals efficiently consume content across multiple content sources.

    We want to enable a content consumption experience that is simple to use, yet comprehensive in depth and breadth of content.

    We are already 3 months into our journey.

    Over the next 6–12 months, we will be focused on stage 1, working closely with users to build something amazing that we can then scale up in stage 2.

    Our headquarters are in the Sydney CBD, but we have team members around the world. This role can be on-site, remote, or a mix of both. Our distributed team structure requires all team members to be flexible around time zones. Each role also has minimum daily crossover time requirements.

    Our remote-friendly work culture and processes have been in place for a couple of years (our team used to work together on a product that reached over 4 million users) and our distributed team structure is working well.

    We work hard at fostering a focused and friendly workplace, where team members are able to do their best work.

    We are looking for someone with outstanding technical experience, a mature attitude and a preference for working with a small smart team, to join us in the role of Machine Learning Engineer at Ukufu.

    You will work directly with the CEO and Tech Lead, as well as the development and product teams.

    Our team of 8 currently includes 3 engineers. We aim to add 4 more engineers (including this role) over the next couple of months.

    Our non-engineering team members include a Design Lead and Product Manager who you will work closely with.

    Our current stack includes Flutter, Python, PHP, Kubernetes and AWS.

    We also use some off-the-shelf Machine Learning services, which we would like to migrate to a custom system that will be architected and built by you.

    Over time, Machine Learning will drive an increasingly large proportion of our value proposition. This is therefore a senior role, with the opportunity to contribute significantly to Ukufu’s success.

    Overview of the role

    • Take ownership of Ukufu’s AI capability and conceptualise, architect and build Ukufu’s Machine Learning system.

    • Work with the product team to translate product requirements into technical capability and provide AI powered feature suggestions.


    • At least 3 years of relevant experience in a similar role.

    • A passion for Machine Learning and real-world applications.

    • Previous experience with natural language processing (NLP) frameworks and applications, including text classification, named-entity recognition (NER) and chunking.

    • Strong Python skills.

    • Experience with PyTorch, and NLP frameworks such as SpaCy and Flair.

    • Working knowledge of computer vision frameworks, including implementing convolutional neural network (CNN) based binary and multi-class classifiers using transfer learning in PyTorch and Keras.

    • Excellent written and verbal skills.

    • Relevant University degree.

    • Experience designing and building complex software solutions and related infrastructure.

    • Strong background in OO development with a proficient understanding of fundamental principles such as TDD, DDD, SOLID, DRY and KISS.

    • Familiarity with AWS services (e.g. RDS, DMS, S3, EC2, CloudWatch, CloudSearch, Elasticsearch, etc.).

    • Working knowledge of DevOps procedures, including using the Linux command line and using Docker to deploy machine learning applications to Kubernetes clusters.

    • Experience with system monitoring tools.

    • Familiarity with popular server software packages (MySQL, PostgreSQL).

    • Exceptional attention to detail and the ability to manage multiple high-priority projects and tasks.

    • Passion for solving complex technical problems.

    • Enjoy working in a fast-moving environment.

    Bonus Skills

    • Has contributed to open source projects (provide examples if available)

    • Previous experience working with a distributed team
