This month - Remote Apache jobs
  • Bellroy
    Required: US only.


    Help us make better decisions by getting our (many) data pipelines flowing smoothly and sensibly into a well-architected warehouse. Bellroy loves to balance art with science; using data to inform and test our theories, and iterating towards Truth.

    We need your help to build, improve and maintain the infrastructure required to extract, transform, load and enrich data from a variety of sources. With your help we’ll be able to get the right info, infer useful things and make better decisions. Then test, and keep improving.
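    For a flavour of what “extract, transform, load” means in practice, here is a minimal sketch in plain Python. The field names and the in-memory “warehouse” are illustrative stand-ins, not Bellroy's actual stack (which, per the requirements below, involves SQL and a workflow manager such as Apache Airflow):

```python
# Minimal ETL sketch. The "sku"/"qty" fields and the list-backed
# warehouse are hypothetical stand-ins for a real source and database.

def extract(rows):
    """Pull raw records from a source, dropping unreadable ones."""
    return [r for r in rows if r is not None]

def transform(rows):
    """Normalise field formats and types before loading."""
    return [
        {"sku": r["sku"].strip().upper(), "qty": int(r["qty"])}
        for r in rows
    ]

def load(rows, warehouse):
    """Append cleaned rows to the warehouse table (a list stand-in)."""
    warehouse.extend(rows)
    return len(rows)

raw = [{"sku": " abc-1 ", "qty": "2"}, None, {"sku": "xyz-9", "qty": "5"}]
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
# loaded == 2, and the warehouse now holds the two cleaned rows
```

    In a real pipeline each stage would be a separate scheduled task with its own retries and monitoring; the chaining of the three functions is the part that carries over.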

    When the warehouse is fully functional, you’ll lend your sharp logic to improve internal processes, automate systems, and optimise, well, everything. And down the track we’d love you to get involved with some Bayesian analysis, machine learning and other interesting data team projects.

    If you get excited by the idea of providing the right data to inform great decisions; and you want that data to be accessible, understandable and trustworthy, then this could be the job for you. If you bring your experience, smarts and detail-oriented brain to help us, we’ll offer a world-class team to learn from, the tools you need to do your thing, and the support you need to flourish.


    Your logic and reasoning are formidable, as is your determination. You are ordered and decisive, yet creative at the same time. You like things to end up fitting neatly in the boxes; but you don’t mind looking outside of them in the first place to find the answers. You get a kick out of getting things right, and you have the patience to do that every time. You present your work with pride, knowing you’ve done all you can to make sure it’s correct – and useful.

    Disorder makes you uncomfortable. With well ordered data in a clean schema, life is good. But you love a challenge more than most other things, so where you see inconsistency or confusion, you also see an opportunity. And you’ll follow that thread until you’re confident you’ve sorted it out.

    You might have been called a “bookworm” more than once in your life, because your curiosity has you forever seeking (and absorbing) information. And nowadays, this natural thirst for knowledge has you constantly looking at ways to improve, optimise and enhance – yourself, your processes, and your data. Much as a mechanic does with a car, you’ll break apart the whole into pieces you can check and analyse, before rebuilding an idea, process or system into something better.

    Sound familiar? If so, we’d love to meet you (and that brain of yours).


    • Reviewed our overall data systems (supported by very competent sysadmins) to make sure everything was in order and to look for larger scale improvements
    • Built out a handful of new pipelines to bring more of our core business data into our data warehouse
    • Chased a handful of data validation alerts raised by our pipelines, and taken the time to get to the root cause of each of them, then either delegated the fix to an appropriate someone else or fixed them yourself
    • Suggested an improvement to our A/B testing infrastructure
    • Worked outside of the data team, with our developers, flexing your database and query optimisation skills to decide whether to fix a performance issue they’re having at the database level, or insist that the fix should be in the code (and that’s fun - they’re an excellent bunch)
    • Provided an ad-hoc analysis (working with our analysts) to someone who requested it, integrating a one-off data source
    • Talked with our CIO about some of our mid-term plans, and how we’ll support them with data
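    The data validation alerts mentioned above are typically raised by simple batch checks run against each load. A minimal sketch, with hypothetical field names and rules (not the company's actual checks):

```python
# Hypothetical batch validation: flag rows with a missing order_id
# or a non-numeric / negative total, so a human can chase root causes.

def validate(rows):
    """Return a list of human-readable problems found in a batch."""
    problems = []
    for i, row in enumerate(rows):
        if row.get("order_id") is None:
            problems.append(f"row {i}: missing order_id")
        if not isinstance(row.get("total"), (int, float)) or row["total"] < 0:
            problems.append(f"row {i}: bad total {row.get('total')!r}")
    return problems

batch = [
    {"order_id": 1, "total": 19.99},
    {"order_id": None, "total": 5.00},
    {"order_id": 3, "total": -2},
]
alerts = validate(batch)
# two alerts: a missing order_id and a negative total
```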


    • At least three years’ experience in data-related roles
    • Advanced working knowledge of SQL and experience in ETL using a workflow management tool such as Apache Airflow
    • Experience with building and optimising data pipelines
    • Experience with collecting data from a variety of sources including APIs (good APIs, bad APIs, and ugly APIs)
    • Strong analytical skills and an ability to perform root cause analysis
    • Training in Computer Science, Statistics, Informatics, Information Systems or another relevant quantitative field (or demonstrable skill in one of those areas and the story of how you built that skill without formal training)
    • Very high precision – you need to know how to verify that your work is correct
    • Bonus points for more experience with relevant programming languages (e.g. Ruby on Rails, Python, R, Scala), PostgreSQL, project management and machine learning.
  • Webinterpret
    PROBABLY NO LONGER AVAILABLE. Required: Europe only.

    Senior Python Developer

    Location: Remote

    Employment: contract, about 160 hours per month


    As a Senior Developer, you will be a part of a multi-disciplinary agile team that is responsible for developing services and plugins to process and localize large amounts of e-commerce listings and run thousands of localized e-commerce sites.

    • You will work on the development of our internal micro-services & APIs to support a flow of millions of products and orders between domestic and international stores
    • You will improve the performance and scalability of our services
    • You will live and breathe Test-Driven Development to outsmart your QA colleagues
    • You will continually seek to develop your skills, learning tools and technologies that help you master your profession
    • You will stand up to the challenges your software may present to a not-so-amused customer - be the light in the tunnel, not just at the end of it.
    • You will influence your peers and stakeholders to design a top-notch solution for every problem space you put your hands on.
    • You will help with architecture-level design decisions during various phases of a project.



    • 5+ years of recent production quality Python product development experience.

    • Experience in designing highly scalable web applications

    • Experience developing and maintaining complex web architectures

    • Experience with the following technologies:

    • Git

    • SQL databases such as MySQL, PostgreSQL, or similar

    • NoSQL databases such as MongoDB, Memcached, or similar

    • Strong analytical skills

    • Experience in SOA and message queues (like RabbitMQ)

    • Good written and spoken English; comfortable working in an international environment

    • Experience in scalability and caching techniques

    • Knowledge of TDD

    • Knowledge of REST (Swagger)

    • Having contributed to Open Source (send us your GitHub id!)

    • Basic DevOps skills (*nix, Apache/nginx)

    • Some experience with Docker

    • Knowledge of e-commerce platforms (e.g. Shopify, WooCommerce, Magento)

    • Knowledge of Amazon Web Services (EC2, RDS, ELB, EBS)

    • Experience in ElasticSearch

    • Experience working in an Agile environment

    WE OFFER


    • The chance to work on large-scale exciting products
    • The opportunity to work as part of a dynamic, experienced, international team
    • Attractive remuneration
    • Team structure that allows working remotely
    • Referral program

    Interested? Send your CV to our recruiter:

    or apply directly:

  • DealTracker

    Here's to the crazy ones. The hackers. The doers. The curious geeks in a world of corporate zombies…

    A cool, fully-remote startup is looking for a Full-Stack Engineer… preferably one that does NOT suck! You must speak Python better than your mother tongue, and be able to do the work of both a backend developer and a data engineer.

    On top of the salary, you'll get stock options, performance-based bonuses, and annual profit share, as well as extensive training and mentoring, BUT…

    You must be an absolute perfectionist — you're simply too passionate about your work to call something "done" when it's not near perfect yet!

    Do you remember how "Monica" from F.R.I.E.N.D.S was obsessed with the little details? Now, imagine if she became a software engineer somehow… Do you think this is you?

    Okay, we want to hire you if you…

    • have rock-solid experience in building data-intensive web applications
    • are proficient with Python 2.7 and Python 3 alike
    • have good knowledge of Python's standard library
    • know Django/DRF
    • know your way around AWS
    • know how to BigData
    • have ninja-level skills with SQL (PostgreSQL/MySQL)
    • worked with ETL and data pipelines before
    • have advanced skills when it comes to web scraping
    • are proficient with UNIX and Shell Scripting
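    As a taste of the parsing half of the web-scraping requirement, here is a standard-library-only sketch. The HTML snippet and the `price` class are made up for illustration; a real scraper would fetch pages over HTTP and respect robots.txt and rate limits:

```python
# Collect the text inside <span class="price"> elements using only
# the stdlib. The markup below is a made-up example page fragment.
from html.parser import HTMLParser

class PriceParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_price = False
        self.prices = []

    def handle_starttag(self, tag, attrs):
        # attrs arrives as a list of (name, value) tuples
        if tag == "span" and ("class", "price") in attrs:
            self.in_price = True

    def handle_data(self, data):
        if self.in_price:
            self.prices.append(data.strip())

    def handle_endtag(self, tag):
        if tag == "span":
            self.in_price = False

page = '<div><span class="price">$19.99</span><span class="price">$5.49</span></div>'
parser = PriceParser()
parser.feed(page)
# parser.prices == ["$19.99", "$5.49"]
```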

    It would be nice if you…

    • have experience with any of these technologies: ElasticSearch, SQLAlchemy, Selenium, Golang, Elixir, Apache Kafka, React Native, or Serverless/AWS Lambda
    • have DevOps experience
    • worked on recommender systems of any kind
    • are not afraid of frontend work; React.js, JavaScript, HTML, & CSS
    • are a fan of Pink Floyd

    On top of that, you…

    • are passionate about making a difference in an early-stage startup with a kickass product
    • like hacking pet projects, just for fun and kudos
    • can work in a fully-remote environment and embrace asynchronous communication
    • document and test your code, and you are familiar with continuous delivery
    • are familiar with Agile methodologies
    • have Sherlock Holmes-like detective skills; you know how to dive deep into data investigations to identify unknown problems and debug data anomalies.

    Your typical day at DealTracker

    You will be working in a small team of A-players, reporting directly to our Tech Lead and our CEO. In a typical day, you will:

    • build and scale new features on both the backend and data engineering sides
    • participate in the design and architecture of new features
    • integrate third-party APIs and services
    • build and test data processing pipelines
    • take care of your personal staging environment

    So, what will we build together?

    DealTracker is a real-time deal aggregator that collects deals from all the major online retailers in the US and ranks them based on current and historical prices, product reviews, brand reputation, etc.

    It's also a social platform for curated shopping, where people can create shareable product collections in a Pinterest-like fashion. They can also follow their favorite brands and product categories for personalized deal discovery.

    We didn't launch the product yet, but you can demo our old PoC here to get an idea.

    Are you the real deal? Let's talk!

    If any of these things fall into your area of expertise and you want to join a kickass team of A-player hackers, now is the time to apply…

    Fill this quick form to apply.

    We would love to hear about your experience and the coolest projects you've contributed to.

    We are very flexible on timezones. This job should fit nicely with your typical workday if you're from North America, Europe, or Africa.

    Our process is fast; we're doing interviews this week, and by next Friday, we will choose the best two applicants to join us.

Older - Remote Apache jobs
  • MD Ranger, Inc.

    MD Ranger is a growing, profitable and dynamic healthcare SaaS company serving hospitals across the U.S. We are the leading supplier of non-salary physician compensation benchmarks. We are seeking a Full Stack Engineer to become a key part of our team in the development of new and innovative benchmarking, analytic, and visualization tools.

    In this role, you will develop new products, improve existing features, and ensure the quality of both our data and our user experience. You will work on web tools, database systems, and proprietary report writers. In a given week, you may collaborate with leadership to discuss new product opportunities, work with another senior engineer to improve speed and functionality, integrate new features into our customer portal, and develop new ways to visualize benchmark data.

    The successful candidate should be…

    • Experienced and comfortable working independently and owning their product

    • Broadly experienced with demonstrated depth in a few areas; curious and interested to learn new things

    • Secure as an expert in their technical domain and eager to learn the product domain

    • Incredibly detail-oriented, holding self and peers to high standards of quality

    • Eager to be part of a team, with strong communication and collaboration skills


    • Experience with advanced user interface designs that present complex, large datasets in a simple and understandable way

    • At least 6 years of professional work experience with Ajax, PHP, MySQL, JavaScript, Git, jQuery, jQueryUI.

    • System administration of cloud-based production systems including MySQL, Apache, and Linux (CentOS, Ubuntu), ensuring continuous service.

    • Design and programmatic generation of PDF and web-based documents

    • Understanding of basic statistics, algorithms and data structures, performance tuning

    • Normalization and interpretation of unstructured data

    • Experience implementing procedures around security and privacy

    • Acquaintance with Ruby and Joomla system administration is a plus

    To apply, send your resume and cover letter to [email protected]

  • EasyPractice
    PROBABLY NO LONGER AVAILABLE. Required: Europe only.

    EasyPractice is looking for a Full Stack Developer with a keen eye for UI and UX.

    EasyPractice is a SaaS platform that enables clinicians and therapists to handle all their administrative work. We're growing steadily and have more than 19,000 users on our platform (primarily spread across Denmark, Sweden, Norway, and the UK).

    We are looking for a developer who knows his/her way around Laravel and Vue.js but, most importantly, has a passion for and an understanding of how to implement these with the end-user in mind.

    We do not require you to have expert-level knowledge of Laravel, but you should know your way around the framework. You'll primarily be working in Vue.js and sketching out solutions for new features. A typical task could be:

    1) Our users tell us they want to be able to put their clients on waiting lists.

    2) You figure out what the user's specific needs are.

    3) You implement the feature as a prototype in Vue.js.

    4) You finish up the feature with a backend developer who does most of the backend.

    Here is a short summary of the tech stack we use at EasyPractice: PHP, Laravel, Vue.js, Sass, MySQL, Apache, GitHub, Slack.

    EasyPractice is a remote-first company and we try to structure ourselves so it doesn't matter where in the world you are. We have a co-working space in Copenhagen where you can work from (you'll get your own desk and a great lunch if you choose to), but employees can essentially work from anywhere. This means that most of our communication happens in Slack, where we write, call and share our screens when we need to talk.

    🕒 You get the freedom to decide your own working hours 

    ✔️ You'll have a lot of say in the final outcome of the stuff we develop

    🌍 You'll be able to work from anywhere

    We hope you'll consider applying for this position 🙂

  • RealScout

    REALSCOUT | Senior Data Engineer / Data Architect - Data Pipeline | REMOTE (minimum 5-hour overlap with Pacific US Timezone) | Full-Time

    This role is specifically to work on our data pipeline - the core of our technology. We're flexible on title; the only hard requirement is that you're senior in experience. The pipeline is responsible for providing agents, brokers, and homebuyers real estate updates from 100+ nationwide data feeds as quickly as possible. We’re looking for someone with at-scale experience to make improvements both in the pipeline and to its architecture -- currently Apache Airflow, Golang, Python, AWS, and Postgres, but flexible. Deployment, logging, metrics collection, SLA improvements: everything is fair game!

    A typical week will entail:

    • Ensuring perfect replication of 100+ real estate data feeds with as little lag as possible
    • Scaling a daily emailer from 100k to 1m personalized sends
    • Expanding our set of attributes that no one else in the industry has, like "stainless steel appliances" and "near Google shuttle stops"


    • Experience with medium-to-large data pipelines: implementing, testing, instrumenting, and deploying
    • Experience with stream processing tools such as Kafka, Kinesis, Spark, Storm, and/or Flink
    • Familiarity with Python and Go (bonus points for Ruby, which the main website runs on)
    • Familiarity with automated unit and integration testing
    • Experience with a wide variety of data stores such as PostgreSQL, ElasticSearch, and Redshift
    • Experience with one major cloud provider (Google, Azure, AWS). AWS a plus.


    After you submit an application, if it looks like there's a good fit, we'll reach out to schedule an initial 20 minute conversation for introductions and to answer your questions about RealScout. In the meantime, visit for more info.

  • ClickMagick
    PROBABLY NO LONGER AVAILABLE. $80,000.00 - $150,000.00.

    Job Description

    This SaaS business provides an advertising tracking and marketing optimization platform used by thousands of small businesses to make important marketing decisions and ultimately grow their business online.

    In this position your primary role will be as a Linux Systems and Database administrator, working alongside our current sysadmin to maintain, monitor and continually improve the performance and high-availability of our front-end, application and database servers. You'll also help to create tools and automate tasks that make life easier for the rest of the team.

    Skills, Experience & Requirements

    • 5+ years Linux sysadmin (we use CentOS)
    • 5+ years Apache/Nginx (we also use mod_perl, lua and OpenResty)
    • 5+ years MySQL DBA (we use Percona MySQL)
    • 5+ years Perl scripting/development
    • You will be part of our on-call rotation

    While the primary current requirement is that of a systems and database administrator, there's also lots of opportunity to expand into other areas as well including backend development/DevOps.

    We're still a small team. You must be self-motivated and eager to work independently from your location. If you need a boss watching over you cracking the whip, someone to review everything you do, or you’re a social butterfly, then I'm sorry but this is definitely not for you.

    Ridiculous attention to detail is a must, and you should also be reliable to a fault.

    While you'll be able to work from home, your favorite coffee shop or anywhere else you want, ideally we're looking for someone in the Austin, TX area. All applications will be considered, but if you're in the Austin, TX area or willing to relocate to this fabulous city that will be a HUGE plus.

  • Action Verb

    Are you an expert in the inner workings of the FTP protocol or SFTP protocol? Would building a server that achieves perfect compatibility with the RFC while still managing to support buggy clients make you smile?

    If so, we’d like you to hear about us! We are one of the largest providers of commercial FTP and SFTP server hosting in the world. Our service was originally launched as BrickFTP, and even though we’ve grown beyond FTP in our service offerings, we have thousands of businesses that rely on our services 24/7 for reliable FTP and SFTP server hosting.

    Our FTP and SFTP server code is primarily written in Java and leverages open source code from the Apache foundation (that we have considerably extended).

    We’re looking for a rare unicorn who knows FTP and/or SFTP at a protocol level and knows Java to help us modernize our FTP and SFTP server code and implement many new exciting features.

    As an FTP or SFTP server developer with us, you’ll take ownership of our FTP and SFTP server code, helping us keep it modern, fast, and maintainable, while implementing exciting new features.

    These codebases are currently written in Java, and we want applicants to be comfortable in Java, but we are also interested in rewriting these in more modern languages such as Go or Elixir, especially if we can leverage existing FTP or SFTP libraries.

    In this role, you’ll work on the following things and more:

    • Keeping our SSH and TLS cryptography up to date, including integrating new ciphers and curating the list of approved ciphers for maximum security.

    • Ensuring that our FTP and SFTP server products are maximally compatible with the wide range of client software in the wild (even the buggy software).

    • Optimizing our FTP and SFTP servers for speed and throughput so our customers get the fastest server experience possible.

    • Integrating new features such as client-side encryption.

    Minimum Qualifications:

    • 5+ years of directly applicable experience.

    • Deep understanding of the FTP and/or SFTP protocols at the protocol level. Having written an FTP or SFTP client or server in the past would be a strong indicator of this.

    • Strong Computer Science background and understanding of algorithms and data structures.

    • History developing and supporting actual applications that have seen production usage with a large userbase.

    • Complete understanding of how to write secure code and an awareness of common web application security vulnerabilities.

    • Solid experience with Java, ideally the new features in Java 8 and Java 10.

    Preferred Qualifications:

    • Participation in open source projects, including ideally being the maintainer for a package that sees large usage in the community.

  • SecurityTrails

    We are looking for a Lead Data Scientist to build a technical team and help us gain useful insight out of raw data as well as automate the creation and retrieval of the data.

    Your ultimate goal will be to help improve our products and business decisions by making the most out of our data, finding creative ways to improve and obtain new data, and helping to build out our incredible data team.

    Your responsibilities:

    •    Manage a team of data scientists, machine learning engineers and big data specialists

    •    Lead data mining and collection procedures

    •    Ensure data quality and integrity

    •    Interpret and analyze data problems

    •    Conceive, plan and prioritize data projects

    •    Build analytic systems

    •    Visualize data and create reports

    •    Experiment with new models and techniques

    •    Align data projects with organizational goals

    You are skilled in:

    •    Apache Kafka

    •    Apache Spark

    •    BigQuery

    •    Elasticsearch

    •    or similar technologies

    You should have a strong problem-solving ability and a knack for statistical analysis. If you are also able to align our data products with our business goals, we would like to meet you.

    Your benefits:

    •    working full-time remotely

    •    trips to team meet-ups are paid

    •    teammates from countries all over the world

    For further information about our remote work culture visit us on

    Our mission

    SecurityTrails strives to make the biggest treasure-trove of cyber intelligence data readily available in an instant. We work relentlessly to empower experts so they can thwart future attacks with up-to-date data, proprietary tools, and custom solutions.

    A Security Beast Built Bit By Bit

    We started because we were tired. Tired of combing through domain lists and forensic data manually, tired of searching through numerous sites for all the data we needed. We patiently waited for the perfect tool, but it never came. Our solution had to be vast, fast, and able to update daily — so we assembled a talented team and built it from scratch.

    SecurityTrails was founded in June 2017, and from the very start it was decided that it would be a fully remote team. What began as a team of three people based in the US has grown into a team that currently counts 19 individuals. And that’s not even taking into account the number of contractors we work with, living and working across the entire globe.