16 Remote Big Data Jobs in April 2020

  • Hottest Remote Jobs

    • Thorn (US only - East Coast Preferred)
      1 week ago

      Thorn is a non-profit focused on building technology to defend children from sexual abuse. Working at Thorn gives you the opportunity to apply your skills, expertise and passions to directly impact the lives of vulnerable and abused children. Our staff solves dynamic, quickly evolving problems with our network of partners from tech companies, NGOs, and law enforcement agencies. If you are able to bring clarity to complexity and lightness to heavy problems, you could be a great fit for our team.

      Earlier this year, we took the stage at TED and shared our audacious goal of eliminating child sexual abuse material from the internet. A key aspect of our work is partnering with the National Center for Missing & Exploited Children and building technology to optimize the broader ecosystem combating online child sexual abuse.

      What You'll Do

      • Collaborate with other engineers on your team to build a data pipeline and client application end-to-end.

      • Prototype, implement, test, deploy, and maintain stable data engineering solutions.

      • Work closely with the product manager and engineers to define product requirements.

      • Present possible technical solutions to various stakeholders, clearly explaining your decisions and how they address real user needs, incorporating feedback in subsequent iterations.

      Skills We're Seeking

      • You have a commitment to putting the children we serve at the center of everything you do.

      • You have proficient software development knowledge, with experience building, growing, and maintaining a variety of products, and a love for creating elegant applications using modern technologies.

      • You’re experienced with DevOps (Docker, AWS, microservices) and can launch and maintain new services.

      • You are experienced with distributed data storage systems/formats such as MemSQL, Snowflake, Redshift, Druid, Cassandra, Parquet, etc.

      • You have worked with real-time systems using various open source technologies like Spark, MapReduce, NoSQL, Hive, etc.

      • You have knowledge in data modeling, data access, and data storage techniques for big data platforms.

      • You have an ability and interest in learning new technologies quickly.

      • You can work with shifting requirements and collaborate with internal and external stakeholders.

      • You have experience prototyping, implementing, testing, and deploying code to production.

      • You have a passion for product engineering and an aptitude for working in a collaborative environment; you can demonstrate empathy and strong advocacy for our users while balancing the vision and constraints of engineering.

      • You communicate clearly, efficiently, and thoughtfully. We’re a highly distributed team, so written communication is crucial, from Slack to pull requests to code reviews.

      Technologies We Use

      You should have experience with at least a few of these, and a desire and ability to learn the rest. A brief sketch of how a few of them fit together follows the list.

      • Python

      • Elasticsearch / PostgreSQL

      • AWS / Terraform

      • Docker / Kubernetes

      • Node / TypeScript
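
      As a minimal, hypothetical sketch of how parts of this stack combine (the table name, index name, and connection details are invented for illustration; the psycopg2 and elasticsearch-py client libraries are assumed):

          # Sync rows from PostgreSQL into an Elasticsearch index.
          import psycopg2
          from elasticsearch import Elasticsearch

          def sync_reports(pg_dsn: str, es_host: str) -> None:
              es = Elasticsearch([es_host])
              with psycopg2.connect(pg_dsn) as conn:
                  with conn.cursor() as cur:
                      cur.execute("SELECT id, source, created_at FROM reports")
                      for row_id, source, created_at in cur.fetchall():
                          # Each relational row becomes one searchable document.
                          es.index(
                              index="reports",
                              id=row_id,
                              body={"source": source,
                                    "created_at": created_at.isoformat()},
                          )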

  • Software Development (10)

    • At Numbrs, our engineers don’t just develop things – we have an impact. We change the way people manage their finances by building the best products and services for our users.

      Numbrs engineers are innovators, problem-solvers, and hard-workers who are building solutions in big data, mobile technology and much more. We look for professional, highly skilled engineers who evolve, adapt to change and thrive in a fast-paced, value-driven environment.

      Join our dedicated technology team that builds massively scalable systems, designs low latency architecture solutions and leverages machine learning technology to turn financial data into action. Want to push the limit of personal finance management? Join Numbrs.

      Job Description

      You will be a part of a team that is responsible for developing, releasing, monitoring and troubleshooting large scale micro-service based distributed systems with high transaction volume. You enjoy learning new things and are passionate about developing new features, maintaining existing code, fixing bugs, and contributing to overall system design. You are a great teammate who thrives in a dynamic environment with rapidly changing priorities.

      All candidates will have

      • a Bachelor's or higher degree in a technical field of study, or equivalent practical experience
      • experience with high volume production grade distributed systems
      • experience with micro-service based architecture
      • experience with software engineering best practices, coding standards, code reviews, testing and operations
      • hands-on experience with Spring Boot
      • professional experience in writing readable, testable and self-sustaining code
      • strong hands-on experience with Java (minimum 8 years)
      • knowledge of AWS, Kubernetes, and Docker
      • excellent troubleshooting and creative problem-solving abilities
      • excellent written and oral communication in English and interpersonal skills

      Ideally, candidates will also have

      • experience with Big Data technologies such as Kafka, Spark, and Cassandra
      • experience with CI/CD toolchain products like Jira, Stash, Git, and Jenkins
      • fluency with functional, imperative, and object-oriented languages
      • experience with Scala, C++, or Golang
      • knowledge of Machine Learning

      Location: residence in the UK mandatory; home office

    • 4 days ago
      New Context is a rapidly growing consulting company in the heart of downtown San Francisco. We specialize in Lean Security: a methodology to consistently apply DevSecOps strategies, leading organizations to build better, safer software through hands-on technical and management consulting. We are a group of engineers who live and breathe Agile Infrastructure, Systems Automation, Cloud Orchestration, and Information & Application Security.

      We are seeking a Managed Services Engineer to work remotely as part of a team supporting services for our MSP clients. Support duties include break/fix work, scaling and capacity planning, and participating in a scheduled on-call rotation.

      Expect to heavily use Open Source software to take on challenges like delivery of highly secured containers, management of IoT devices, or supporting Big Data ecosystems at petabyte scale and beyond. You will utilize our core methodologies (Agile, Lean, TDD, and Pair Programming), along with your fluency in DevOps, to run robust and reliable systems for our clients.

      We foster a tight-knit, highly-supportive environment where there are no stupid questions. Even if you do not know the answer immediately, you'll have the entire company supporting you via Slack, Zoom, or in-person. We also host a daily, all-company stand-up via Zoom, and a weekly company Retro, so you won't just be a name on an email. 

      At New Context, our core values are Humility, Accountability, Integrity, Empathy, Creativity, Transparency, Quality & Passion! Our employees live these values every single day.

      Who You Are:
      • Engineer with 2-4 years of IT Experience (Desktop Support, Systems Engineering, Systems Administration);
      • Comfortable with at least one high-level language, ideally Ruby and/or Python (but could be anything); 
      • Experienced in Open Source web technologies, especially Docker, Kubernetes, GitLab, and/or the HashiCorp suite;
      • An excellent communicator, able to deliver project status and explain technical elements to non-technical audiences;
      • Able to think on your feet and learn quickly on-the-job in order to meet the expectations of our clients;
      • A great teammate and a creative and independent thinker.
      Bonus points if you are:
      • Experienced with in-depth AWS administration;
      • Experienced and effective working with external clients and customers;
      • A believer in automated tests and their role in software engineering;
      • Able to translate complex concepts to business customers.
      We tailor solutions to our customers. You might support any of the following technologies:
      • Automation: HashiCorp Product Suite (Terraform, Packer, Vault, Consul, Vagrant), Chef, Puppet, Ansible, Salt, Automated Testing
      • Containerization Ecosystem: Kubernetes, Docker, D2iQ, Nomad, Rancher, CoreOS
      • Cloud & Virtualization: AWS, GCP, Azure, OpenStack, Cloudstack, kvm, libvirt
      • Tools: GitLab, Jenkins, Atlassian Suite, Pivotal Tracker, Git
      • Monitoring: Sysdig, Twistlock, Datadog, AppDynamics, New Relic, Sentry, Nagios, Prometheus
      • Databases/Datastores: Cassandra, Hadoop, Redis, PostgreSQL, MySQL
      • Security: Compliance Automation, Application Security, Firewalls, OSSEC, AuthN and AuthZ
      • Languages: Ruby, Python, Go, JavaScript
      We are committed to equal-employment principles, and we recognize the value of committed employees who feel they are being treated in an equitable and professional manner. We are passionate about finding ways to attract, develop and retain the talent and unique viewpoints needed to meet business objectives, and to recruit and employ highly qualified individuals representing the diverse communities in which we live, because we believe that this diversity results in conversations which stimulate new and innovative ideas.

      Employment policies and decisions on employment and promotion are based on merit, qualifications, performance, and business needs. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.

    • We are Givelify®, where fintech meets philanthropy. We help people instantly find causes that inspire them to action so they can change the world—one simple, joyful gift at a time. 
       
      We are looking for a Principal Software Engineer with 10+ years of experience to join our team with a "lead the charge" attitude. Why Engineering at Givelify is different: moonshots are our norm. Our product impacts real people on the ground. We build with passion and maintain a high standard of engineering quality. We solve unique scalability challenges, and you will have the ability to help guide all aspects of Engineering and Product Development.

      Some of the meaningful work you will perform:

      • PHP/Python: You will need to have strong object-oriented design and development skills and advanced knowledge of PHP, Python or similar programming languages. Knowledge and experience with third party libraries, frameworks, and technologies is a plus. 
      • Database: You will need to have strong SQL composition skills. Knowledge of big data and NoSQL databases is a plus! We not only write software that collects and queries data, but we also compose queries for investigation and analysis. We collect a lot of data in real time from our applications, and being able to compose ad hoc queries is necessary to develop and support our products (see the small example after this list).
      • Guidance: Participate in and guide engineering teams on all things technical: architecture definition and design ownership that include not only technology but also data security, deployment and cloud strategy, CI/CD, and coding best practices.
      • Analysis & Problem Solving: You will need to understand our codebase and systems and the business requirements they implement so you can effectively make changes to our applications and investigate issues. 
      • Communication: Whether via face-to-face discussion, phone, email, chat, white-boarding, or other collaboration platforms, you must be an effective communicator who can inform, explain, enable, teach, persuade, coordinate, etc. 
      • Team Collaboration: You must be able to effectively collaborate and share ownership of your team’s codebase and applications. You must be willing to fully engage in team efforts, speak up for what you think are the best solutions, and be able to converse respectfully and compromise when necessary. 
      • Knowledge and Experience: A well-rounded software engineer will have broad and/or deep knowledge of various topics, tools, frameworks, and methodologies related to software engineering. None are required, but the more you can bring, the better. Here are some examples:

      • Laravel, Yii and similar frameworks
      • Strong API knowledge and development
      • Git and GitHub
      • Big Data solutions such as Cassandra, Hadoop, Spark, Kafka, and Elasticsearch
      • Continuous integration and automated testing
      • Agile/Scrum
      • Open source projects
      • Server-side JavaScript, TypeScript and Node.js
      • Familiarity with DevOps configuration tools (Git, Jira, Jenkins, etc.)
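
      As a small, purely illustrative example of the kind of ad hoc investigation query the role describes (the schema, data, and the in-memory sqlite3 database are invented for this sketch; the production stack may differ):

          import sqlite3

          # Invented schema and data, standing in for real giving records.
          conn = sqlite3.connect(":memory:")
          conn.executescript("""
              CREATE TABLE gifts (
                  id INTEGER PRIMARY KEY,
                  platform TEXT,
                  amount_cents INTEGER,
                  given_at TEXT
              );
              INSERT INTO gifts (platform, amount_cents, given_at) VALUES
                  ('ios', 2500, '2020-04-01'),
                  ('android', 1000, '2020-04-01'),
                  ('web', 5000, '2020-04-02');
          """)

          # Ad hoc question: how much was given per platform per day?
          rows = conn.execute("""
              SELECT given_at, platform, SUM(amount_cents) / 100.0 AS dollars
              FROM gifts
              GROUP BY given_at, platform
              ORDER BY given_at, platform
          """).fetchall()
          for given_at, platform, dollars in rows:
              print(given_at, platform, dollars)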

      We welcome your talents and experience:

      • BS/MS degree in Computer Science, Computer Engineering, Mathematics, Physics or equivalent work experience
      • Technical leader with at least 10 years of work in Software Engineering
      • Web services and API development experience within a startup and/or e-commerce environment
      • A distinguished member of the engineering community, whether through extracurricular activities, publications, or associations with organizations (e.g., IEEE)
      • Demonstrated history of living the values important to Givelify, such as integrity and ethics

      Our People 
      We are a virtual team of high-performing professionals who innovate & collaborate to fulfill our mission to help people instantly find causes that inspire them to action so they can change the world – one simple, joyful gift at a time. Our culture of integrity, heart, simplicity, & that "wow" factor fuel our aspiration to be among the tech industry's most inclusive & purpose-driven work environments. 
      We take great pride in providing competitive pay, full benefits, amazing perks, and most importantly, the opportunity to put passion & purpose to work. 
       
      Our Product 
      From places of worship to world-changing nonprofit groups, Givelify harnesses the power of technology to bridge the gap between people and the causes they care about. Tap. Give. Done. Givelify's payment solution is designed to make the experience of giving as beautiful as the act of giving. 
       
      Learn more about us at https://careers.givelify.com

    • Railroad19 (US only)
      1 week ago
      We are looking for a savvy Data Engineer to join our growing team of analytics experts. The hire will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our software developers, database architects, data analysts, and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. The right candidate will be excited by the prospect of optimizing or even re-designing our company’s data architecture to support our next generation of products and data initiatives.

      Responsibilities for Data Engineer
      • Create and maintain optimal data pipeline architecture.
      • Assemble large, complex data sets that meet functional / non-functional business requirements.
      • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
      • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.
      • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
      • Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
      • Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
      • Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
      • Work with data and analytics experts to strive for greater functionality in our data systems.
      Qualifications for Data Engineer
      • Understanding of concepts such as Change Data Capture, Event Sourcing, and CQRS patterns using event based systems
      • Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases.
      • Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
      • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
      • Strong analytic skills related to working with unstructured datasets.
      • Build processes supporting data transformation, data structures, metadata, dependency and workload management.
      • A successful history of manipulating, processing and extracting value from large disconnected datasets.
      • Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
      • Strong project management and organizational skills.
      • Experience supporting and working with cross-functional teams in a dynamic environment.
      • We are looking for a candidate with 5+ years of experience in a Data Engineer role, who has attained a degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field. They should also have experience using the following software/tools:
      • Experience with stream-processing systems: Kafka, NiFi, Storm, Spark Streaming, etc. (a brief Spark Streaming sketch follows this list)
      • Strong knowledge of object-oriented/functional programming with Java 8+ or other JVM languages (Scala, Clojure, Kotlin, Groovy)
      • Hands-on experience with ETL techniques and frameworks like Apache Spark or Apache Flume.
      • Strong understanding of data serialization formats like Apache Avro, Parquet, Protobuf, Apache Thrift.
      • Experience with relational SQL and NoSQL databases, including Postgres and Cassandra, MongoDB, ElasticSearch.
      • Use of AWS cloud services: EC2, EMR, RDS, Redshift, S3, Lambda, Kinesis.
      • Experience with integration of data from multiple data sources.
      • Understanding of the importance of CI/CD, unit/integration testing, build tooling (Maven, Gradle, sbt), and dependency management.
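
      As a minimal sketch of the stream-processing work described above (the topic, broker, and bucket names are invented; PySpark with the Kafka connector is assumed):

          from pyspark.sql import SparkSession

          spark = SparkSession.builder.appName("events-etl").getOrCreate()

          # Read a stream of raw events from Kafka (names are illustrative).
          events = (
              spark.readStream.format("kafka")
              .option("kafka.bootstrap.servers", "broker:9092")
              .option("subscribe", "events")
              .load()
              .selectExpr("CAST(value AS STRING) AS payload", "timestamp")
          )

          # Land the stream as Parquet for downstream analytics.
          query = (
              events.writeStream.format("parquet")
              .option("path", "s3a://data-lake/events/")
              .option("checkpointLocation", "s3a://data-lake/checkpoints/events/")
              .start()
          )
          query.awaitTermination()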


      About RR19
      • We develop customized software solutions and provide software development services. We’re a specialized team of developers and architects, and we only bring an “A” team to the table, through hard work and a desire to lead the industry. This is our company culture, and it is what sets Railroad19 apart.
      • At Railroad19, Inc. you are part of a company that values your work and gives you the tools you need to succeed. Our executive headquarters is in Saratoga Springs, New York, but we are a distributed team of remote developers across the US, and this position is remote. Railroad19 provides competitive compensation and excellent benefits: medical/dental/vision, vacation, and a 401K.
      Working at Railroad19:
      • Competitive salaries
      • Excellent health care, dental, and vision benefits
      • 3 weeks vacation, 401K, work-life balance
      • No agencies
      • This is a non-management position

      We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender, gender identity or expression, or veteran status. We are proud to be an equal opportunity workplace.
    • Do you live and breathe AWS certifications?
      We are looking for a Certification Specialist to help us build and maintain our AWS certification content library.

      Cloud Academy is the leading digital skills development platform that enables every enterprise to become a tech company through guided Learning Paths, Hands-on Labs, and Skill Assessment. Cloud Academy delivers role-specific training on leading cloud technologies (AWS, Azure, Google Cloud Platform), essential methodologies needed to operate on and between clouds (DevOps, security, containers), and capabilities that are unlocked by the cloud (big data, machine learning).

      As a member of our Content Team, you will be challenged to produce and maintain educational content for Cloud Academy, a renowned training provider for thousands of people worldwide. Our customers vary from US-based Fortune 500 companies to IT professionals in Japan to small teams of developers in the UK. You’ll report to the AWS Content & Security Lead, and your time will be divided between creating new learning content yourself and collaborating with our internal content, product, and engineering teams to make sure Cloud Academy creates the best possible product and learning materials for our users.

      Responsibilities

      • Ensure the content within all learning paths relating to AWS certifications is accurate and up to date
      • Create and update online courses, quizzes, labs, and learning paths related to AWS services and certification tracks
      • Collaborate with the AWS Content & Security Lead and the rest of the content team to define the strategy and roadmap for Cloud Academy’s content
      • Work with our marketing and sales teams to support their efforts to promote Cloud Academy content, including through creating blog posts and helping to build our network and community

      Requirements

      • Infrastructure or developer experience on AWS
      • Hold one or more AWS certifications already and have a passion for gaining more at the Professional or Specialty level
      • Technically proficient with cloud computing: security concepts, machine learning principles, and database and big data technologies
      • Superior oral and written communication skills

      Bonus Points (you should have deep experience with some of the below):

      • Application development languages, such as C#, Java, Python, Ruby, PHP
      • Version control tools, such as git
      • Continuous integration and continuous deployment
      • Windows and/or Linux servers
      • Configuration management tools, such as Chef, Puppet, Ansible
      • Authentication and access control mechanisms
      • Network architectures

      Benefits:

      • Competitive salary with an annual bonus
      • Budget for professional development
      • 4 weeks paid vacation per year
      • Great company culture and work environment
      • Highly-skilled teammates and lots of opportunities for growth and development

      We value diversity and are an equal opportunity employer at Cloud Academy. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.

    • Wikimedia Foundation, Inc.
      2 weeks ago

      The Wikimedia Foundation is hiring two Site Reliability Engineers to support and maintain (1) the data and statistics infrastructure that powers a big part of decision making in the Foundation and in the Wiki community, and (2) the search infrastructure that underpins all search on Wikipedia and its sister projects. This includes everything from eliminating boring things from your daily workflow by automating them, to upgrading a multi-petabyte Hadoop or multi-terabyte Search cluster to the next upstream version without impacting uptime and users.

      We're looking for an experienced candidate who's excited about working with big data systems. Ideally you will already have some experience working with software like Hadoop, Kafka, Elasticsearch, Spark, and other members of the distributed computing world. Since you'll be joining an existing team of SREs, you'll have plenty of space and opportunities to get familiar with our tech (Analytics, Search, WDQS), so there's no need to immediately have the answer to every question.

      We are a full-time distributed team with no one working out of the actual Wikimedia office, so we are all together in the same remote boat. Part of the team is in Europe and part in the United States. We see each other in person two or three times a year, either during one of our off-sites (most recently in Europe), the Wikimedia All Hands (once a year), or Wikimania, the annual international conference for the Wiki community.

      Here are some examples of projects we've been tackling lately that you might be involved with:

      •  Integrating an open-source GPU software platform like AMD ROCm into Hadoop and the TensorFlow-related ecosystem
      •  Improving the security of our data by adding Kerberos authentication to the analytics Hadoop cluster and its satellite systems
      •  Scaling the Wikidata query service, a semantic query endpoint for graph databases
      •  Building the Foundation's new event data platform infrastructure
      •  Implementing alarms that alert the team of possible data loss or data corruption
      •  Building a new and improved Jupyter notebooks ecosystem for the Foundation and the community to use
      •  Building and deploying services in Kubernetes with Helm
      •  Upgrading the cluster to Hadoop 3
      •  Replacing Oozie with Airflow as a workflow scheduler (a minimal DAG sketch follows this list)
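
      As a minimal, hypothetical sketch of that last item (the DAG id, task names, and commands are invented; the import path matches Airflow 1.x, which may differ from what the team actually runs):

          from datetime import datetime

          from airflow import DAG
          from airflow.operators.bash_operator import BashOperator  # Airflow 1.x path

          # Two dependent daily tasks, standing in for an Oozie coordinator job.
          with DAG(
              dag_id="webrequest_rollup",
              start_date=datetime(2020, 4, 1),
              schedule_interval="@daily",
              catchup=False,
          ) as dag:
              extract = BashOperator(task_id="extract",
                                     bash_command="hdfs dfs -ls /wmf/data")
              aggregate = BashOperator(task_id="aggregate",
                                       bash_command="spark-submit rollup.py")
              extract >> aggregate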

      And these are our more formal requirements:

      •    A couple of years' experience in an SRE/Operations/DevOps role as part of a team
      •    Experience in supporting complex web applications running highly available and high traffic infrastructure based on Linux
      •    Comfortable with configuration management and orchestration tools (Puppet, Ansible, Chef, SaltStack, etc.), and modern observability infrastructure (monitoring, metrics and logging)
      •    An appetite for the automation and streamlining of tasks
      •    Willingness to work with JVM-based systems  
      •    Comfortable with shell and scripting languages used in an SRE/Operations engineering context (e.g. Python, Go, Bash, Ruby, etc.)
      •    Good understanding of Linux/Unix fundamentals and debugging skills
      •    Strong English language skills and ability to work independently, as an effective part of a globally distributed team
      •    B.S. or M.S. in Computer Science, related field or equivalent in related work experience. Do not feel you need a degree to apply; we value hands-on experience most of all.

      The Wikimedia Foundation is... 

      ...the nonprofit organization that hosts and operates Wikipedia and the other Wikimedia free knowledge projects. Our vision is a world in which every single human can freely share in the sum of all knowledge. We believe that everyone has the potential to contribute something to our shared knowledge, and that everyone should be able to access that knowledge, free of interference. We host the Wikimedia projects, build software experiences for reading, contributing, and sharing Wikimedia content, support the volunteer communities and partners who make Wikimedia possible, and advocate for policies that enable Wikimedia and free knowledge to thrive. The Wikimedia Foundation is a charitable, not-for-profit organization that relies on donations. We receive financial support from millions of individuals around the world, with an average donation of about $15. We also receive donations through institutional grants and gifts. The Wikimedia Foundation is a United States 501(c)(3) tax-exempt organization with offices in San Francisco, California, USA.

      The Wikimedia Foundation is an equal opportunity employer, and we encourage people with a diverse range of backgrounds to apply.

      U.S. Benefits & Perks*

      • Fully paid medical, dental and vision coverage for employees and their eligible families (yes, fully paid premiums!)
      • The Wellness Program provides reimbursement for mind, body and soul activities such as fitness memberships, baby sitting, continuing education and much more
      • The 401(k) retirement plan offers matched contributions at 4% of annual salary
      • Flexible and generous time off - vacation, sick and volunteer days, plus 19 paid holidays - including the last week of the year.
      • Family friendly! 100% paid new parent leave for seven weeks plus an additional five weeks for pregnancy, flexible options to phase back in after leave, fully equipped lactation room.
      • For those emergency moments - long and short term disability, life insurance (2x salary) and an employee assistance program
      • Pre-tax savings plans for health care, child care, elder care, public transportation and parking expenses
      • Telecommuting and flexible work schedules available
      • Appropriate fuel for thinking and coding (aka, a pantry full of treats) and monthly massages to help staff relax
      • Great colleagues - diverse staff and contractors speaking dozens of languages from around the world, fantastic intellectual discourse, mission-driven and intensely passionate people

      *Eligible international workers' benefits are specific to their location and dependent on their employer of record

    • SemanticBits (US only)
      3 weeks ago

      SemanticBits is looking for a talented Senior Data Engineer who is eager to apply computer science, software engineering, databases, and distributed/parallel processing frameworks to prepare big data for use by data analysts and data scientists. You will mentor junior engineers and deliver data acquisition, transformation, cleansing, conversion, compression, and loading of data into data and analytics models. You will work in partnership with data scientists and analysts to understand use cases, data needs, and outcome objectives. You are a practitioner of advanced data modeling and optimization of data and analytics solutions at scale; an expert in data management, data access (big data, data marts, etc.), programming, and data modeling; and familiar with analytic algorithms and applications (like machine learning).

      Requirements

      • Bachelor’s degree in computer science (or related) and eight years of professional experience
      • Strong knowledge of computer science fundamentals: object-oriented design and programming, data structures, algorithms, databases (SQL and relational design), networking
      • Demonstrable experience engineering scalable data processing pipelines
      • Demonstrable expertise with Python, Spark, and wrangling of various data formats: Parquet, CSV, XML, JSON (see the sketch after this list)
      • Experience with the following technologies is highly desirable: Redshift (w/Spectrum), Hadoop, Apache NiFi, Airflow, Apache Kafka, Apache Superset, Flask, Node.js, Express, AWS EMR, Scala, Tableau, Looker, Dremio
      • Experience with Agile methodology, using test-driven development.
      • Excellent command of written and spoken English
      • Self-driven problem solver
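
      As a minimal sketch of the format wrangling mentioned above (the paths and column handling are invented; PySpark is assumed, per the requirements):

          from pyspark.sql import SparkSession

          spark = SparkSession.builder.appName("format-wrangling").getOrCreate()

          # Read a CSV extract; header and schema inference are illustrative choices.
          claims = (
              spark.read.option("header", True)
              .option("inferSchema", True)
              .csv("s3a://bucket/claims.csv")
          )

          # Normalize column names, then write Parquet for analysts.
          renamed = claims.toDF(*[c.strip().lower().replace(" ", "_")
                                  for c in claims.columns])
          renamed.write.mode("overwrite").parquet("s3a://bucket/claims_parquet/")
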
    • 3 weeks ago
      Company Overview
      At Netlify, we're building a platform to empower digital designers and developers to build better, more elaborate web projects than ever before. We're aiming to change the landscape of modern web development.

      Currently, we are looking for a Data Engineer to help us with that mission. If you are someone who enjoys helping people succeed with technology, isn’t afraid to help improve our practices and our product, and is an effective and enthusiastic self-directed learner, read on.

      As the company scales at light speed, data science is critical to all sorts of decisions. We value data science as a company and have a centralized and growing data science team. Our data science team started in 2018, when we were about 20 people as a company! Now we are a company of 90 people and growing fast! The hire will be responsible for setting up and maintaining our data environment, management, and pipeline. Databricks is our primary data environment.

      We’re a venture-backed company, and so far we've raised about $45 million from Andreessen Horowitz, Kleiner Perkins, Bloomberg, and prominent founders and professionals in our space.

      Netlify is a diverse group of incredible talent from all over the world. We’re ~40% women or non-binary, and are composed of about half as many nationalities as we are team members.

      About the role:
      • Determine and construct data schema to support analytical and modeling needs
      • Work on creating and maintaining existing ETL to get data for business users and support our BI dashboard
      • Assist in developing a framework to automate ingestion and integration of structured data from a wide variety of data sources
      • Use tools, processes, and guidelines to ensure data is correct, standardized, and documented
      • Assist in scaling pipelines to meet performance requirements
      • Collaborate and work alongside data engineers to set up and maintain the production environment to support data science workflow
      • Work on a data retention strategy for different pipelines/sources and help with implementation
      • Build frameworks to validate data integrity and work with infra data engineers to improve the testability of our data pipeline (a small validation sketch follows this list)
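
      As a small sketch of what such a validation framework might check (the column names, CSV source, and the use of pandas are all assumptions for illustration):

          import pandas as pd

          def validate(df: pd.DataFrame) -> list:
              """Return a list of human-readable integrity violations."""
              problems = []
              if df["id"].duplicated().any():
                  problems.append("duplicate primary keys")
              if df["created_at"].isna().any():
                  problems.append("missing timestamps")
              if (df["amount"] < 0).any():
                  problems.append("negative amounts")
              return problems

          events = pd.read_csv("events.csv")  # source path is illustrative
          issues = validate(events)
          if issues:
              raise ValueError("data integrity check failed: " + ", ".join(issues))
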
      Qualifications

      The data engineer will have an immediate connection to our values. This individual contributor will be extremely flexible and enjoy a “startup” mentality and environment that changes day-to-day.

      • 2+ years of professional engineering experience working with SQL and relational databases to build data pipelines
      • Experience in data extraction, cleaning, and mining, and in maintaining data pipelines
      • Ability to analyze sources of data, build and maintain schemas
      • Experience building frameworks to validate data integrity and improve the testability of our data pipelines
      • Proven ability to keep proper documentation
      • BS in computer science or related background
      Nice to have:
      • Experience with building scalable and reliable data pipelines using big data engine technologies like Spark, AWS EMR, Redshift, etc.
      • Experience with cloud technologies such as AWS or Google Cloud
      • Experience working with BI
      • SaaS Experience
      • Some knowledge of web development
      About Netlify

      Of everything we've ever built at Netlify, we are most proud of our team.

      We believe that empowered, engaged colleagues do their best work. We’ll be giving you the tools you need to succeed and looking to you for suggestions to improve not just your daily job, but every aspect of building a company. Whether you work from our main office in San Francisco or you are a remote employee, we’ll be working together a lot: pairing, collaborating, debating, and learning. We want you to succeed! About 60% of the company is remote across the globe; the rest are in our HQ in San Francisco.

      To learn a bit more about our team and who we are, make sure to visit our about page.

      Applying

      Not sure you meet 100% of our qualifications? Please apply anyway!

      With your application, please include: A thoughtful cover letter explaining why you would enjoy working in this role and why you’d like to work at Netlify. A resume or short listing of job history. (A link to a LinkedIn profile would be fine.)

      When we receive your complete application with the items above, we’ll get back to you about the next steps.

  • Marketing / Sales (3)

    • Emptor Demand Generation Analyst
      We are looking for a marketing analyst who will be able to work independently to set, measure, track, and report on goals for a newly forming marketing organization. This person will also have a high degree of comfort working with content marketing and creating customer- and prospect-facing collateral. We are looking for someone who is a self-starter and able to think creatively to find new leads, understand their needs, and suggest solutions.

      About the team
      Founded in 2016 and headquartered in New York, Emptor is a fully remote B2B SaaS startup. We are focused on solving trust and safety issues in Latin America by using big data. Emptor currently serves large multinational technology firms operating in the region by building tools for operational decision making on a large scale. We have opportunities in product, sales, finance, infrastructure, architecture, web scraping, NLP, ML and data science.

      Location
      Remote - Global team, 8:00 AM UTC-5 to 8:00 PM UTC-5.

      Requirements
      -BS/MS degree.
      -2-3 years of proven working experience in digital marketing.
      -Experience with SEO/SEM, marketing databases, email, social media, and/or display advertising campaigns.
      -Knowledge of website analytics tools (e.g., Google Analytics).
      -Up-to-date with the latest trends and best practices in online marketing and measurement.
      -Strong writing skills.
      -Can synthesize feedback from different stakeholders and incorporate that into their work. 
      -Excellent written and verbal communication.
      -Analytical background.
      -Comfortable working in a startup.

      Expectations as a Demand Generation Analyst:
      Practices & Behavior:
      -Manage custom lead lists, create landing pages, and support lead nurturing campaigns.
      -Design, prepare, and support marketing campaigns for targeted and mass market accounts.
      -Set, analyze, model, and forecast Key Performance Indicators (KPIs) and website analytics across campaigns.
      -Create and monitor analytics for social campaigns (LinkedIn, Facebook, Twitter, Email, Blog).
      -Build reports on campaign, landing page, and web analytics with changes and optimizations needed for the future.
      -Determine content gaps and collaborate with cross-functional teams to develop new assets.
      -Help create sales collateral such as demo decks and one-pagers.
      -Develop case studies and other content based on customer stories.
      -Work on articles for Emptor blog.
      -Develop monthly and quarterly multi-channel program analysis to ensure revenue goals are met.
      -Monitor and report lead scoring and recommendations.
      -Integrate with marketing CRM.

      Who can apply?
      Preferred careers:
      -Demand generation analyst.
      -Content Manager.
      -Marketing Manager at a SaaS tech startup.

      Note: Emptor will not sponsor applicants for work visas.
    • 4 weeks ago

      D2iQ is hiring an Enterprise Account Executive in the Southeast USA to bring our world-class solutions to our world-class customers. We need people who have managed complex sales relationships at the largest customers in the world and still understand how to work with people at all levels of an organization. We get to work with the best technology and the best technologists there are. We get to help our customers understand our vision as we are executing on it and understand their vision as they are executing on theirs. We are building the next great enterprise software company.

      Job Responsibilities
      • Develop and execute successful sales cycles around named strategic enterprise prospects that are looking to deploy next generation distributed applications at massive scale and help them understand the D2iQ approach and the D2iQ differentiation.
      • Qualify these customers in and out based on their challenges, our solutions and their willingness to partner with a growing software company.
      • Build a sustainable network that includes partners, integrators and analysts to help evangelize the D2iQ approach.
      • Build and manage a healthy pipeline of customers that want to partner with D2iQ
      • Build sustainable relationships with our prospects and customers. Understand their requirements and represent them to our product, support and executive teams to help us build world-class software.
      Skills & Requirements
      • You have a minimum of 5 years of an outstanding, proven sales track record in high-end enterprise data/datacenter platform software.
      • You have a genuine passion for, and understanding of, how next-generation data-driven applications transform businesses and society.
      • You have good general knowledge of, and interest in, new technologies in the Big Data, IoT, and datacenter technology landscape.
      • Familiarity with the open source ecosystem and the container landscape is a huge plus.
      • You have a proven ability to effectively form, lead, and inspire highly talented virtual account teams, including partners, to enable maximum customer success and consequently amazing sales results.
      • Experience with at least one light-speed-growing startup; you have to be comfortable with things that break and people who love to fix things when they break.
      • You are an expert in navigating and establishing high impact trusted relationships on all levels (especially C-level) in large corporate customer environments.
      • You must have a burning desire to be successful and to see your customers and virtual teams succeed.
      About D2iQ - Your Partner in the Cloud Native Journey

      On your journey to the cloud, you need to make numerous choices—from the technologies you select, to the frameworks you decide on, to the management tools you’ll use. What you need is a trusted guide that’s been down this path before. That’s where D2iQ can help.

      D2iQ eases these decisions and operational efforts. Rather than inhibiting your choices, we guide you with opinionated technologies, services, training, and support, so you can work smarter, not harder. No matter where you are in your journey, we’ll make sure you’re well equipped for the road ahead.

      Backed by T. Rowe Price, Andreessen Horowitz, Khosla Ventures, Microsoft, HPE, Data Collective, and Fuel Capital, D2iQ is headquartered in San Francisco with offices in Hamburg, London, New York, and Beijing.

    • Oden Technologies (North America)
      1 month ago

      About Oden:

      We are on the brink of the next industrial revolution.

      Manufacturing has long been an analog world, but this is about to change. By introducing machines to the digital world, there’s a staggering opportunity for efficiency and production leaps. Oden is driving this revolution. We’re on a mission to eliminate waste in manufacturing.

      We have combined industrial hardware, wireless connectivity, and big data architecture into one simple platform so all manufacturers can analyze and optimize their production, from any device. Efficiency, sustainability, and competitiveness are democratized.

      Why We Do It:

      We like to enable those who make things - to make more, to waste less, to serve their customers, and to thrive in a competitive world. Help enough makers, and the world can give us all the abundance we want for less cost and environmental impact. We’re on the verge of a 4th industrial revolution that begs for absolute efficiency in all factors of life. We plan to deliver that to everyone who makes things.

      Oden Values:

      • We foster trust, collaboration and personal development in our team, clients, and society

      • We communicate openly in all directions, embracing failure

      • We face new challenges and information head-on; everyone’s impact matters

      • We learn from our mistakes and move forward with purpose and plan

      • We are building a community of those who create and move boundaries, with ownership and passion

      The Role:

      As Director of Sales, you’ll have the opportunity to significantly impact the growth of our company and help redefine the way the manufacturing world works. In order to continue to accelerate our rapid growth, we’re looking for people who embrace aggressive goals and work hard to achieve them. You’ll handle the sales process from marketing qualified to closed won, and close deals with managers and executives of medium and large-sized industrial corporations. This individual puts the best interests of the clients first and is excited to work for one of the most innovative tech companies in IoT, Industry 4.0, and Big Data Analytics. You’ll work closely with Oden’s dedicated lead generation team, who books meetings on behalf of the Director of Sales.

      Responsibilities Include But Not Limited To:

      • Owning the entire sales funnel from lead to closed won.

      • Working with your supporting BDR to accelerate opportunity creation.

      • Going after new logos and securing pilots that lead to initial roll-outs, up-sells, and cross-sells

      • Achieving Deal and ARR quotas and developing future targets

      • Qualifying inbound leads and MQLs into SQLs and forecasted opportunities

      • Maintaining an accurate pipeline and forecasts in the CRM

      • Representing Oden at industry trade shows and conferences

      • Evangelizing Oden’s vision for Industry 4.0 and the progression of technological adoption in the manufacturing sector

      • Developing and executing strategies to take market share from Oden’s competitors

      • Tracking all sales activities with attention to detail and taking in-depth notes for distribution to the marketing team.

      • Travel as needed (estimated 30%) to support prospects and clients

      Minimum qualifications:

      • 5 to 10+ years of sales expertise in B2B SaaS and/or manufacturing

      • Consistent track record of meeting or exceeding aggressive quota/revenue targets

      • Understanding of the enterprise sales process and how to navigate large organizations to successfully sell high-value solutions to committees and cross-functional groups

      • Courtesy, integrity, and a constant attention to the client’s best interests

      • Ability to develop lasting business relationships with C-level decision makers

      • Experience working closely as a partner with prospective clients to understand their needs.

      • Curiosity and enthusiasm about technology

      • Exceptional communication skills

      • Organized, independent worker

      • Experience with a CRM system is an asset.

      Nice to have:

      • Startup, analytics, and IoT/IIoT experience

      You:

      • Care about the mission of the product and company.

      • Are never satisfied with the way things are, but excited about the way things could be.

      • Are a lifelong learner with a thirst to help grow businesses using data

      • Are a team player and can think strategically about how to communicate an idea to a market.

      • Know how to navigate a large, complex organization to establish a cohesive narrative between platform users and executive buyers.

      • Empathize with customer needs while also looking towards long-term innovation.

      • Live by transparent and scientific thinking. You put in the work to find the best ideas with those around you, without ego.

      What We Offer You:

      • Measurable impact on the world and the chance to help real people: family businesses, entrepreneurs, engineers.

      • Exposure to many tech disciplines, most of which are rapidly evolving.

      • A bridge between the physical and cloud worlds of tech. Our platform unites big data visualizations with sensors and heavy industrial equipment.

      • A platform that has the potential to evolve beyond what we have envisioned now.

      • Scientific and transparent thinking, for everyone involved.

      • We’re an equal opportunity employer (EOE).

      Diversity at Oden means building a team that is rich across all boundaries of race, ethnicity, gender identity, sexual orientation, disability, religion, age, and thinking style. We welcome all backgrounds, life experiences, and worldviews, as this is the catalyst for the rapid evolution of our product and our organization. Diversity allows us to tackle new challenges, embrace change, make well-informed decisions, and ultimately Make Things Better. In alignment with our “People First” company value, Oden has a passionate internal team dedicated to the promotion of diversity and inclusion initiatives as a core component of our culture.

      Our diversity initiatives apply to our practices and policies on recruiting, compensation and benefits; professional development; promotions; social activities and the ongoing development of a psychologically safe work environment.

  • Product (1) Product rss feed

    • This is an outstanding opportunity to join a rapidly growing, Series A, remote-friendly fintech with a dynamic team. You will be building the future of Business Credit using cutting-edge AI/ML techniques.

      We seek an ambitious and self-motivated Head of Products to spearhead new product rollout. The ideal teammate is an experienced professional looking to take ownership of our AI financial solution.

      As Head of Products, you will be responsible for:

      • Gathering and analyzing feedback from customers, development, sales, marketing, and the market
      • Creating and managing the overall product management process
      • Building and growing the product team and contractors to enable rapid product development
      • Leading our internal process to define the product strategy, roadmap, timeline, and priorities
      • Developing requirements documents and use cases for new product features
      • Becoming an expert on other products in the marketplace

      You should be:

      • Curious to learn and assimilate information quickly, enthusiastic to share and teach others.
      • Keenly analytical and meticulous about problem-solving.
      • An outstanding communicator with sound interpersonal skills.
      • Strongly interested in technology and continuous learning.
      • Able to work autonomously and resourcefully in a fast-paced startup environment.

      You should have:

      • 4+ years of product management experience
      • Passion for fintech
      • Recent experience in a product leadership position with responsibility for ideation, development and product definition for SaaS or Analytics solutions
      • Experience with big data/analytics and/or fintech platforms
      • Experience with agile development processes
      • A “roll-up-your-sleeves” entrepreneurial, outgoing startup attitude: you thrive in and enjoy a small startup atmosphere.
      • A BS degree in a technical field and an MBA from top-tier schools are highly desirable
  • All others (2) All others rss feed

    • Auth0 (North America)
      1 week ago
      Auth0 is a pre-IPO unicorn. We are growing rapidly and looking for exceptional new team members who will help take us to the next level. One team, one score.

      We never compromise on identity. You should never compromise yours either. We want you to bring your whole self to Auth0. If you’re passionate, practice radical transparency to build trust and respect, and thrive when you’re collaborating, experimenting and learning – this may be your ideal work environment. We are looking for team members who want to help us build upon what we have accomplished so far and make it better every day. N+1 > N.

      Auth0 is looking for a Data Scientist to join our Growth Marketing team. This role will blend research and application to develop and apply data-centric models that support all aspects of marketing at Auth0. Additionally, this role will collaborate directly with analysts and business owners in these functional areas to understand their challenges and develop practical solutions. If you love working with big data, scaled optimization, probabilistic inference, and machine learning, this is the right role for you!

      The Growth team is an innovative and forward-thinking team of analysts and engineers working to improve Auth0’s marketing funnel and user engagement. This is an individual contributor role, but you will act with substantial independence in both technical model-building and stakeholder collaboration.
      This role can be based in our Bellevue, WA office or from a remote home office anywhere in North America.

      What You Will Do
      • Work with Growth Marketers to understand best practices, propose strategic ideas, and turn them into initiatives that drive growth
      • Define and regularly monitor KPIs, success metrics, and other analytics to maximize our conversion rate across our digital channels
      • Build models, such as customer lifetime value (LTV), and help drive strategic and operational changes across all marketing initiatives (see the sketch after this list)
      • Collaborate with internal marketing and other cross-company teams to understand challenges and create data-focused solutions
      • Stay abreast of technology trends and best practices in data modeling 
      • Apply creative analytical problem-solving skills to a wide variety of marketing questions to deepen our understanding of campaign effectiveness, customer journeys, and go-to-market performance
      • Collaborate with leadership in marketing and cross-functionally to provide a data-driven viewpoint on strategic decisions
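
      For illustration only: one common simplification of LTV combines average revenue per account (ARPA), gross margin, and a constant churn rate. The sketch below is a minimal, hypothetical example of that textbook formula, not Auth0’s actual methodology; the function name and figures are invented.

      ```python
      # Minimal LTV sketch: LTV = ARPA * gross margin / monthly churn rate.
      # All names and figures are hypothetical; a real marketing model would
      # segment by cohort and channel and estimate churn from historical data
      # rather than assume a constant rate.

      def simple_ltv(arpa: float, gross_margin: float, monthly_churn: float) -> float:
          """Expected lifetime value of an account under constant churn."""
          if not 0 < monthly_churn <= 1:
              raise ValueError("monthly_churn must be in (0, 1]")
          return arpa * gross_margin / monthly_churn

      # e.g. $500/month ARPA, 80% gross margin, 2% monthly churn -> $20,000
      print(simple_ltv(arpa=500.0, gross_margin=0.8, monthly_churn=0.02))
      ```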
      What You Bring
      • 5+ years of relevant analytics/data science experience in a technology company with at least 3 years leading innovative projects dealing with applications of analytics/data science
      • Deep understanding of multi-channel digital marketing
      • Strong communication ability across multi-functional teams for technical and non-technical audiences
      • Expertise in emerging and existing data science technologies and techniques that can enhance Auth0’s marketing efficiency
      • Ability to help design data and computational infrastructure that supports near-real-time model execution, machine learning, and large-scale batch processing for our users, including data pipelines, training environments, and reduced production runtime
      • Understanding of computer science fundamentals, data structures, and algorithms. In-depth knowledge of one or more programming languages (including Java, C/C++, Python)
      • Experience in specialized areas such as Optimization, NLP, Probabilistic Inference, Machine Learning, Recommendation Systems
      • Proven experience with large data sets and related technologies (e.g., Hadoop/Spark); knowledge of SQL is required
      • Experience with LTV/CAC models, Marketing ROI and Performance management, Attribution models, digital and paid media analysis
      Auth0’s mission is to help developers innovate faster. Every company is becoming a software company and developers are at the center of this shift. They need better tools and building blocks so they can stay focused on innovating. One of these building blocks is identity: authentication and authorization. That’s what we do. Our platform handles 2.5B logins per month for thousands of customers around the world. From indie makers to Fortune 500 companies, we can handle any use case.

      We like to think that we are helping make the internet safer. We have raised $210M to date and are growing quickly. Our team is spread across more than 35 countries, and we are proud to be continually recognized as a great place to work. Culture is critical to us, and we are transparent about our vision and principles.

      Join us on this journey to make developers more productive while making the internet safer!
    • Kalepa is looking for Data Scientists to lead efforts at the intersection of machine learning and big data engineering in order to solve some of the biggest problems in commercial insurance.

      Data scientists at Kalepa will be turning vast amounts of structured and unstructured data from many sources (web data, geolocation, satellite imaging, etc.) into novel insights about behavior and risk. You will be working closely with a small team in designing, building, and deploying machine learning models to tackle our customers’ questions.
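
      As a toy illustration of that kind of work (not Kalepa’s actual pipeline; the features, labels, and model choice below are invented for the example), a binary risk classifier in Python might start out like this:

      ```python
      # Hypothetical sketch of a commercial-insurance risk classifier.
      # Features, labels, and model choice are invented for illustration.
      import numpy as np
      from sklearn.ensemble import GradientBoostingClassifier
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(seed=0)
      # Fake features, standing in for e.g. building age, square footage,
      # and prior claim count
      X = rng.normal(size=(1000, 3))
      # Fake label: 1 = claim filed, 0 = no claim (driven by the third feature)
      y = (X[:, 2] + rng.normal(scale=0.5, size=1000) > 0.5).astype(int)

      X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
      model = GradientBoostingClassifier().fit(X_train, y_train)
      print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")
      ```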

      Kalepa is a New York-based, VC-backed startup building software to transform and disrupt commercial insurance. Nearly one trillion dollars ($1T) is spent globally each year on commercial insurance across small, medium, and large enterprises. However, the process for estimating the risk associated with a given business across various perils (e.g. fire, injury, malpractice) still relies on inefficient and inaccurate manual forms or outdated and sparse databases. This information asymmetry leads to a broken set of economic incentives and a poor experience for businesses and insurers alike. By combining cutting-edge data science, enterprise software, and insurance expertise, Kalepa is delivering precision underwriting at scale, empowering every commercial insurance underwriter to be as effective and efficient as possible. Kalepa is turning real-world data into a complete understanding of risk.

      Kalepa is led by a strong team with experiences from Facebook, APT (acquired by Mastercard for $600M in 2015), the Israel Defense Forces, MIT, Berkeley, and UPenn.

      About you:

      ● You want to design a flexible analytics, data science, and AI framework to transform the insurance industry

      ● You have demonstrated success in delivering analytical projects, including structuring and conducting analyses to generate business insights and recommendations

      ● You have an in-depth understanding of applied machine learning algorithms and statistics

      ● You are experienced in Python and its major data science libraries, and have deployed models and algorithms in production

      ● You have a good understanding of SQL and NoSQL databases

      ● You value open, frank, and respectful communication

      ● You are a proactive and collaborative problem solver with a “can do” attitude

      ● You have a sincere interest in working at a startup and scaling with the company as we grow

      As a plus:

      ● You have experience in NLP and/or computer vision

      ● You have familiarity with Spark, Hadoop, or Scala

      ● You have experience working with AWS tools

      What you’ll get

      ● Work with an ambitious, smart, and fun team to transform a $1T global industry

      ● Ground-floor opportunity: build the foundations for the product, team, and culture alongside the founding team

      ● Wide-ranging intellectual challenges working with large and diverse data sets, as well as with a modern technology stack

      ● Competitive compensation package with a significant equity component

      ● Full benefits package, including excellent medical, dental, and vision insurance

      ● Unlimited vacation and flexible remote work policies

      ● Continuing education credits and a healthy living / gym monthly stipend

      [IMPORTANT NOTE]: Salary ranges are for New York-based employees. Compensation for remote roles will be adjusted according to the cost of living and market in the specific geography.