Data Engineer | Logic20/20

Logic20/20 is looking for seasoned Data Engineers to join our Seattle team. Primary responsibilities include building pipelines on cloud platforms, automating processes, and directing initiatives to create key data sets.
Data Engineers will work heavily in databases, employing scaling and automation while performing routine data analyses. Insights gleaned from these processes will be presented to a range of stakeholders within the analytics and business communities, so candidates should feel confident representing their work and our brand as subject matter experts. Data Engineers will continually seek to improve our methods and tools, researching innovations, driving the implementation of new techniques, and mentoring other team members to do the same.
Candidates should have experience engineering client solutions centered on Azure Data Lake and Spark, knowledge of Python, and a minimum of eight years of SQL experience. All cloud platforms are welcome!
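For illustration only, here is a minimal sketch of the kind of Spark pipeline work this role involves: reading raw files from Azure Data Lake Storage, deriving a simple aggregate in Python, and writing a curated data set back out. The storage account, container names, paths, and column names below are hypothetical, not part of the posting.

# Minimal PySpark sketch: raw files in, curated data set out.
# All paths, account names, and columns are hypothetical examples.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("curated-orders-pipeline").getOrCreate()

# Hypothetical ADLS Gen2 locations (abfss://<container>@<account>.dfs.core.windows.net/<path>)
raw_path = "abfss://raw@exampleaccount.dfs.core.windows.net/orders/"
curated_path = "abfss://curated@exampleaccount.dfs.core.windows.net/orders_daily/"

# Read raw CSV files with a header row.
orders = spark.read.option("header", True).csv(raw_path)

# Derive a daily aggregate: total amount and order count per day.
daily_totals = (
    orders
    .withColumn("order_date", F.to_date("order_timestamp"))
    .groupBy("order_date")
    .agg(
        F.sum(F.col("amount").cast("double")).alias("total_amount"),
        F.count("*").alias("order_count"),
    )
)

# Write the curated data set as Parquet, partitioned by date.
daily_totals.write.mode("overwrite").partitionBy("order_date").parquet(curated_path)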
 

See the full job description and apply online.
Interesting technologies

Work with Spring Boot, Spring Cloud, Cassandra, Kafka, and other technologies.

Collaboration & mentorship

Work with a variety of teams internally and foster mentoring relationships.

Varied challenges

Projects bring unique and interesting challenges; architects solve one problem and move on to the next.

Thought leadership

Lead the field in IoT, serverless, containers, big data, machine learning, and AWS cloud computing.