6 Remote TensorFlow Jobs in February 2020

  • Software Development (1)

    • TileDB (US or Greece)
      3 days ago

      We are looking for a Python-focused software engineer to build and enhance our existing APIs and integrations with the Scientific Python ecosystem. TileDB’s Python API (https://github.com/TileDB-Inc/TileDB-Py) wraps the TileDB core C API, and integrates closely with NumPy to provide zero-copy data access. You will build and enhance the Python API through interfacing with the core library; build new integrations with data science, scientific, and machine learning libraries; and engage with the community and customers to create value through the use of TileDB.
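
      For a concrete flavor of the work, here is a minimal, hypothetical sketch of the NumPy-centric TileDB-Py interface (the array name and attribute are illustrative; see the TileDB-Py repository for the current API):

      import numpy as np
      import tiledb

      # Define a small 1-D dense array with a single float64 attribute
      dim = tiledb.Dim(name="x", domain=(0, 9), tile=10, dtype=np.int32)
      schema = tiledb.ArraySchema(
          domain=tiledb.Domain(dim),
          attrs=[tiledb.Attr(name="a", dtype=np.float64)],
      )
      tiledb.DenseArray.create("example_array", schema)

      # Write a NumPy array, then slice it back; reads come out as NumPy arrays
      with tiledb.DenseArray("example_array", mode="w") as A:
          A[:] = np.arange(10, dtype=np.float64)
      with tiledb.DenseArray("example_array", mode="r") as A:
          data = A[:]["a"]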

      Location

      Our headquarters are in Cambridge, MA, USA and we have a subsidiary in Athens, Greece. However, you will have the flexibility to work remotely as long as your residence is in the USA or Greece. US candidates must be US citizens, whereas Greek candidates must be Greek or EU citizens.

      Expectations

      In your first 30 days, you will familiarize yourself with TileDB, the TileDB-Py API, and the TileDB-Dask integration. After 30 days, you will be fully integrated into our team. You’ll be an active contributor and maintainer of the TileDB-Py project, and ready to start designing and implementing new features, as well as engaging with the Python and data science community.

      Requirements

      • 5+ years of experience as a software engineer
      • Expertise in Python and experience with NumPy
      • Experience interfacing with the CPython API, and with Cython or pybind11
      • Experience with Python packaging, including binary distribution
      • Experience with C, C++, Rust, or a similar systems-level language
      • Experience with distributed computation using Dask, Spark, or a similar system (a short sketch follows this list)
      • Experience with a machine learning library (e.g. scikit-learn, TensorFlow, Keras, PyTorch, Theano)
      • Experience with Amazon Web Services or a similar cloud platform
      • Experience with dataframe-focused systems (e.g. Arrow, Pandas, data.frame, Vaex)
      • Experience with technical data formats (e.g. Parquet, HDF5, VCF, DICOM, GeoTIFF)
      • Experience with other technical computing systems (e.g. R, MATLAB, Julia)
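
      As context for the Dask item above, here is a minimal, hypothetical sketch of the chunked, lazy computation style it refers to (array sizes are arbitrary):

      import dask.array as da

      # Build a large array from 1000x1000 chunks; nothing runs until .compute()
      x = da.random.random((10_000, 10_000), chunks=(1_000, 1_000))
      column_means = x.mean(axis=0)    # lazy: builds a task graph
      result = column_means.compute()  # executes the chunks in parallel
      print(result.shape)              # (10000,)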

      Benefits

      • Competitive salary and stock options
      • 100% medical and dental insurance coverage (for you and your dependents!)
      • Paid parental leave
      • Paid time off (vacation, sick & public holidays)
      • Flexible time off & flexible hours
      • Flexibility to work remotely (anywhere in the US or Greece)

      TileDB, Inc. is proud to be an Equal Opportunity Employer building a diverse and inclusive team.

  • Product (2)

    • D2iQ is looking for an experienced Product Manager who can lead some of the strategic initiatives around Kubernetes and data services. You will collaborate with customers, the open-source community, partners, engineering, marketing, and other functions to build a great product and make it successful in the market. If you’re passionate about product, can identify patterns from customer needs, and can create well-defined requirements and user stories to help engineers deliver fantastic software solutions, come join us!

      Our headquarters is in San Francisco, CA but we're open to remote candidates in the United States or Germany.

      Job Responsibilities
      • Define strategy and drive execution of cloud operations capabilities for D2iQ's strategic Kubernetes initiatives and existing product.
      • Own and prioritize the backlog; participate with engineering in sprint planning to represent customer requirements to ensure we build the right solution
      • Work closely with customers to understand their needs and validate product direction
      • Define features, user stories, requirements and acceptance criteria
      • Deploy, use and test your own product to accept and provide early feedback within the development lifecycle
      • Work with all other functions to enable, market, and advocate for your product.
      Skills & Requirements
      • Experience working with two or more of the following open-source technologies: Kafka, Cassandra, Spark, HDFS, Elasticsearch, TensorFlow, Jupyter, Kubernetes
      • Knowledge of the datacenter infrastructure market and current trends
      • Strong understanding of distributed systems: install, upgrade, backup/restore, compatibility matrix, OS support, logging, metrics, UI, CLI, telemetry, etc.
      • Strong understanding of cloud service providers (AWS, Azure, GCP) and their marketplace offering integrations
      • Technical understanding in one or more of: containerization, virtualization, networking, storage, security, operating systems
      • Proven track record of shipping successful enterprise software products is a must
      • Mastery of lean product development methods and prior hands-on experience with Jira
      • Data-driven decision maker
      • Detail-oriented and passionate about a great user experience, with the ability to back proposed decisions with data
      • Superb communication and presentation skills
      • Minimum 3-5 years of experience as a Product Manager
      • Preference for candidates based in the San Francisco Bay Area, but remote applicants in the US will be considered
      About D2iQ - Your Partner in the Cloud Native Journey

      On your journey to the cloud, you need to make numerous choices—from the technologies you select, to the frameworks you decide on, to the management tools you’ll use. What you need is a trusted guide that’s been down this path before. That’s where D2iQ can help.

      D2iQ eases these decisions and operational efforts. Rather than inhibiting your choices, we guide you with opinionated technologies, services, training, and support, so you can work smarter, not harder. No matter where you are in your journey, we’ll make sure you’re well equipped for the road ahead.

      Backed by T. Rowe Price, Andreessen Horowitz, Khosla Ventures, Microsoft, HPE, Data Collective, and Fuel Capital, D2iQ is headquartered in San Francisco with offices in Hamburg, London, New York, and Beijing.

    • 2 months ago

      D2iQ (formerly Mesosphere) is looking for an experienced Product Manager who can lead some of the strategic initiatives around Kubernetes and data services, as well as own the core of D2iQ's DC/OS platform. You will collaborate with customers, the open-source community, partners, engineering, marketing, and other functions to build a great product and make it successful in the market. If you’re passionate about product, can identify patterns from customer needs, and can create well-defined requirements and user stories to help engineers deliver fantastic software solutions, come join us!

      Our headquarters is in San Francisco, CA but we're open to remote candidates in the United States or Germany.

      Responsibilities
      • Define strategy and drive execution of cloud operations capabilities for D2iQ's strategic Kubernetes initiatives and existing product.
      • Own and prioritize the backlog; participate with engineering in sprint planning to represent customer requirements to ensure we build the right solution
      • Work closely with customers to understand their needs and validate product direction
      • Define features, user stories, requirements and acceptance criteria
      • Deploy, use and test your own product to accept and provide early feedback within the development lifecycle
      • Work with all other functions to enable, market, and advocate for your product.
      Requirements
      • Experience working with two or more of the following open-source technologies: Kafka, Cassandra, Spark, HDFS, Elasticsearch, TensorFlow, Jupyter, Kubernetes
      • Knowledge of the datacenter infrastructure market and current trends
      • Strong understanding of distributed systems: install, upgrade, backup/restore, compatibility matrix, OS support, logging, metrics, UI, CLI, telemetry, etc.
      • Strong understanding of cloud service providers (AWS, Azure, GCP) and their marketplace offering integrations
      • Technical understanding in one or more of: containerization, virtualization, networking, storage, security, operating systems
      • Proven track record of shipping successful enterprise software products is a must
      • Mastery of lean product development methods and prior hands-on experience with Jira
      • Data-driven decision maker
      • Detail-oriented and passionate about a great user experience, with the ability to back proposed decisions with data
      • Superb communication and presentation skills
      • Minimum 3-5 years of experience as a Product Manager
      • Preference for candidates based in the San Francisco Bay Area, but remote applicants in the US will be considered
      D2iQ - Your Partner in the Cloud Native Journey

      On your journey to the cloud, you need to make numerous choices—from the technologies you select, to the frameworks you decide on, to the management tools you’ll use. What you need is a trusted guide that’s been down this path before. That’s where D2iQ can help.

      D2iQ eases these decisions and operational efforts. Rather than inhibiting your choices, we guide you with opinionated technologies, services, training, and support, so you can work smarter, not harder. No matter where you are in your journey, we’ll make sure you’re well equipped for the road ahead.

      Backed by T. Rowe Price, Andreessen Horowitz, Khosla Ventures, Microsoft, HPE, Data Collective, and Fuel Capital, D2iQ is headquartered in San Francisco with offices in Hamburg, London, New York, and Beijing.

  • All others (3)

    • 2 weeks ago
      We are looking for a talented Data Scientist to join our team at Prominent Edge. We are a small company of 24+ developers and designers who put themselves in the shoes of our customers and make sure we deliver strong solutions. Our projects and the needs of our customers vary greatly; therefore, we always choose the technology stack and approach that best suit the particular problem and the goals of our customers. As a result, we want developers who do high-quality work, stay current, and are up for learning and applying new technologies when appropriate. We want engineers who have in-depth knowledge of Amazon Web Services and are up for using other infrastructures when needed. We understand that for our team to perform at its best, everyone needs to work on tasks that they enjoy. Most of our projects are web applications, and they often have a geospatial aspect to them. We also really take care of our employees, as demonstrated by our exceptional benefits package. Check out our website at http://prominentedge.com for more information and apply through http://prominentedge.com/careers.

      Ideal candidates are those who can find value in data. Such a person proactively fetches information from various sources and analyzes it for a better understanding of the problem, and may even build AI/ML tools to generate insights. The ideal candidate is adept at using large datasets to find the right needle in a pile of needles and uses models to test the effectiveness of different courses of action. Candidates must have strong experience using a variety of data mining/data analysis methods and data tools, building and implementing models, using/creating algorithms, and creating/running simulations. They must have a proven ability to drive results with their data-based insights. They must be comfortable working with a wide range of stakeholders and functional teams. The right candidate will have a passion for discovering solutions hidden in large datasets and for working with stakeholders to improve mission outcomes. A successful candidate will have experience in many (if not all) of the following technical competencies: statistics and machine learning, coding languages, databases, and reporting technologies.

      Required Skills
      • Bachelor's Degree in Computer Science, Information Systems, Engineering or other related scientific or technical discipline.
      • Proficient in data preparation, exploration, and statistical analysis
      • Proficient in a programming language such as Python, R, Julia, or JavaScript
      • Experience with batch scripting and data processing
      • Experience with machine learning libraries and frameworks such as TensorFlow/PyTorch, or Bayesian analysis using SAS/RStudio (a brief sketch follows this list)
      • Experience with databases such as Postgres, Elasticsearch, MongoDB, or Redis
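
      As a rough illustration of the TensorFlow experience called for above, here is a minimal, hypothetical sketch (the data and model are synthetic toys, not a project example):

      import numpy as np
      import tensorflow as tf

      # Synthetic toy data: 200 samples, 4 features, binary labels
      rng = np.random.default_rng(0)
      X = rng.random((200, 4)).astype("float32")
      y = (X.sum(axis=1) > 2.0).astype("float32")

      # A tiny feed-forward classifier built with the Keras API
      model = tf.keras.Sequential([
          tf.keras.layers.Dense(8, activation="relu", input_shape=(4,)),
          tf.keras.layers.Dense(1, activation="sigmoid"),
      ])
      model.compile(optimizer="adam", loss="binary_crossentropy",
                    metrics=["accuracy"])
      model.fit(X, y, epochs=5, batch_size=16, verbose=0)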



      Desired Skills
      • Master's degree in Computer Science or related technical discipline.
      • Experience with natural language processing, computer vision, or deep learning
      • Experience working with geospatial data
      • Experience with statistical techniques
      • Experience as either back-end or front-end/visualization developer
      • Experience with visualization and reporting technologies such as Kibana or Tableau



      W2 Benefits
      Not only do you get to join our team of awesome playful ninjas, we also have great benefits:
      • Six weeks paid time off per year (PTO+Holidays).
      • Six percent 401k matching, vested immediately.
      • Free PPO/POS healthcare for the entire family.
      • We pay you for every hour you work. Need something extra? Give yourself a raise by doing more hours when you can.
      • Want to take time off without using vacation time? Shuffle your hours around in any pay period.
      • Want a new MacBook Pro laptop? We'll get you one. If you like your MacBook Pro, we’ll buy you the new version whenever you want.
      • Want some training or to travel to a conference that is relevant to your job? We offer that too!
      • This organization participates in E-Verify.
    • 1 month ago

      At phData, we believe that we are witnessing a new age of digitization and decision automation. To compete, even traditional companies must deliver machine learning and analytics-enabled data platforms and data products. Examples include a medical device manufacturer building machine learning models to guide therapies, or a global industrial manufacturer using deep learning models to identify product defects. This means that the data problems that used to apply only to Google and Facebook now apply to a filter manufacturer, a medical device manufacturer, or a healthcare provider.

      phData matches machine learning and analytics to the business goals of the world's largest and most successful companies. phData is one of the fastest-growing companies in the U.S. and demand for our services and software has skyrocketed. This has resulted in quality growth and an expanded presence at our company headquarters located in Downtown Minneapolis, as well as in Bangalore, India, and across the U.S. We’ve been recognized as one of the Best Places to Work in Minneapolis for three consecutive years and we were listed as the 48th fastest growing private company in the U.S. on the Inc. 5000 list.

      We pride ourselves on offering employees phenomenal growth and learning opportunities in addition to competitive compensation, health insurance, generous PTO and excellent perks including extensive training and paid certifications.

      As a Data Scientist, your responsibilities include:

      • Design, implement, and train algorithms and models that improve and streamline business processes
      • Design, analyze, and interpret the results of experiments
      • Author reports, visualizations, and presentations to communicate your process and results
      • Work closely with development teams, including data engineers and machine learning engineers, to build lasting solutions
      • Individually engage with customers and take responsibility for implementing data science best practices in their work
      • Act as a thought leader: give talks, contribute to open source projects, and advance data science practices on a global scale

      Qualifications

      • Advanced degree or evidence of exceptional ability in engineering, computer science, mathematics, physics, chemistry, or operations research
      • 3-5+ years of hands-on experience building models and developing algorithms for real-world applications and data-driven optimizations
      • Depth of knowledge in advanced mathematics, machine learning, and statistics
      • Strong computer science fundamentals: data structures, algorithms, distributed systems
      • Mastery of one or more data analysis languages such as R, Python, SQL, MATLAB, and/or SAS
      • Experience with data science tools, including Python scripting, NumPy, SciPy, matplotlib, scikit-learn, Jupyter notebooks, bash scripting, and the Linux environment (see the short sketch after this list)
      • Experience with at least one mainstream deep learning framework, e.g. TensorFlow, PyTorch, Caffe(2), MXNet
      • Exemplary verbal and written communication skills
      • Able to work under pressure while managing competing demands and tight deadlines
      • Well organized with meticulous attention to detail
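
      A minimal, hypothetical scikit-learn sketch of the train/evaluate workflow these qualifications describe (the dataset is a built-in toy example, not customer data):

      from sklearn.datasets import load_iris
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.metrics import accuracy_score
      from sklearn.model_selection import train_test_split

      # Split a toy dataset, fit a baseline model, and report held-out accuracy
      X, y = load_iris(return_X_y=True)
      X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
      clf = RandomForestClassifier(n_estimators=100, random_state=0)
      clf.fit(X_train, y_train)
      print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))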

    • As a data scientist you will push the boundaries of deep learning & NLP by conducting research in cutting-edge AI and applying that research to production-scale data. Our data includes a large corpus of global patents in multiple languages and a wealth of metadata. The focus of your work will be to analyze, visualize, and interpret large text corpora as well as more traditional structured data. This includes recommending and implementing ML-based product features. We also expect to publish primary research and contribute to the FOSS that we use.

      This is an exciting opportunity to be part of a startup that is applying deep learning to a real-world problem at global scale. We’re always looking for leaders, and there is room for the right person to grow and take on increasing responsibility as part of a small and dynamic team.

      Location:

      Tokyo, Melbourne, San Francisco (willing to negotiate remote work as well)

      Responsibilities:

      • Architect and implement software libraries for batch processing, API-based predictions, and static analyses

      • Rapidly iterate on the design, implementation and evaluation of machine learning algorithms for document corpora

      • Report and present software developments including status and results clearly and efficiently, verbally and in writing

      • Participate in strategy discussions about technology roadmap, solution architecture, and product design

      • Adhere strictly to clean code paradigms

      Minimum Qualifications and Education Requirements:

      • BSc/BEng degree in computer science, mathematics, machine learning, computational linguistics or equivalent (MSc/MEng preferable)

      • Experience with implementing statistical methods and data visualization

      • Good knowledge of computer science principles underpinning the implementation of machine learning algorithms

      • Experience with deep learning approaches to NLP, particularly RNNs (see the sketch after this list)

      • Experience implementing deep learning models in TensorFlow or PyTorch 
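
      A minimal, hypothetical sketch of the RNN-based NLP modeling mentioned above, written with the Keras API in TensorFlow (vocabulary size, sequence length, and data are toy placeholders):

      import numpy as np
      import tensorflow as tf

      VOCAB_SIZE, MAX_LEN = 5000, 40  # assumed tokenizer vocabulary and sequence length

      model = tf.keras.Sequential([
          tf.keras.layers.Embedding(VOCAB_SIZE, 64, input_length=MAX_LEN),
          tf.keras.layers.LSTM(32),                       # recurrent encoder over token sequences
          tf.keras.layers.Dense(1, activation="sigmoid"), # e.g. a binary document label
      ])
      model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

      # Synthetic integer-encoded "documents" stand in for tokenized patent text
      X = np.random.randint(1, VOCAB_SIZE, size=(64, MAX_LEN))
      y = np.random.randint(0, 2, size=(64,)).astype("float32")
      model.fit(X, y, epochs=2, verbose=0)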

      Preferred Qualifications:

      • Contributions to open source projects

      • Passion for new developments in AI

      • Experience with GCP or AWS

      • A track record of machine learning code that is:
        - Well documented
        - Well commented
        - Version controlled
        - Unit tested

      To apply, please contact us at [email protected] with your CV.