You’ll have the power to go beyond – doing the work that’s transforming how people, businesses and things connect with each other. Not only do we provide the fastest and most reliable network for our customers, but we were also first to 5G: a quantum leap in connectivity. Our connected solutions are making communities stronger and enabling energy efficiency. Here, you’ll have the ability to make an impact and create positive change. Whether you think in code, words, pictures or numbers, join our team of the best and brightest. We offer great pay, amazing benefits and the opportunity to learn and grow in every role. Together we’ll go far.
What you’ll be doing...
You will be responsible for software application development for new and ongoing telecommunications and data communications products. These efforts will be aimed at customer retention and revenue expansion.
Design, develop, and test custom software systems for telecommunications and data communications applications.
Analyze customer requirements and develop concepts for new applications.
Work effectively in an interdisciplinary team environment.
Coordinate with project management, software architects, and other engineering teams in determining overall system solutions.
Support the scoping and implementation of technical solutions: estimate, prioritize, and coordinate development activities.
Build systems using Big Data, microservices, and distributed computing technologies.
Train and mentor other developers in Big Data technologies as a subject matter expert (SME).
Author technical documentation as needed.
Support QA team in developing test plans.
What we’re looking for...
You'll need to have:
Bachelor's degree or four or more years of work experience.
Six or more years of relevant work experience.
Demonstrated work experience in the following: Big Data and distributed programming models and technologies (such as Hadoop and Spark), Hadoop Distributed File System (HDFS), distributed indexing and databases (Solr, Elasticsearch, HBase, Hive, Cassandra, Vertica), and serialization formats (JSON, Avro, Parquet).
Knowledge of database structures, theories, principles and practices (both SQL and NoSQL).
Experience implementing ETL/data pipelines using Apache NiFi and associated tools and technologies.
Ability to pass an extensive background investigation as a condition of employment.
Even better if you have:
Bachelor's degree in Computer Science or ten or more years of professional software development experience.
Master's degree in Computer Science or relevant technology field.
Seven or more years of relevant work experience developing Big Data applications in Linux environments.
Experience with messaging and stream-processing technologies such as Kafka and Spark Streaming.
Experience in data science, statistical analysis, sampling, and modeling.
Experience developing with scripting languages such as Python or Perl.
Familiarity with DevOps and CI/CD tools for automation of build, packaging, deployment, and testing.
Experience with Atlassian’s agile development tools, including Bitbucket, Jira, and Confluence.
Knowledge of information security and data protection best practices.
Knowledge of networking technology for telecommunications and data communications.
Excellent communication and analytical skills.
Equal Employment Opportunity
We're proud to be an equal opportunity employer - and celebrate our employees' differences, including race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, and Veteran status. Different makes us better.