Careers

We are Computewell

Destination for all IT Solutions

Our consultants combine creative problem solving, innovative technologies and proven implementation skills to deliver solutions that give our clients' businesses a competitive edge.

We provide IT consultancy, bespoke application development, project management and technical resourcing services to clients across diverse markets.

If you are interested in working with us, our current openings are listed below. To apply, please send your CV to careers@computewell.co, mentioning the job title in the subject line of the email.

Join the team...

We are currently hiring for these positions:

IoT - Software Developer
Industry Experience: 2 Years+

Desired Skills:

• Apache Kafka, Spark Streaming and EMS messaging suite.
• HBase, Cassandra, Core Spark.
• Geneos, Kibana with Logstash and Beats.
• MongoDB, Apache CouchDB and Elasticsearch NoSQL data stores, alongside Oracle RAC and Microsoft SQL Server Cluster.
• Python, Perl, Angular2, Flask, Shell.
• Some knowledge of a Java framework is beneficial.
• Knowledge of all the tools and technologies listed above is not essential, but the more the better.

Desired Experience:

• Experience working with workflow management abstractions such as Luigi, Airflow and Azkaban.
• Experience building near-real-time streaming platforms using Kafka, Spark Streaming, Core Spark, HBase, Cassandra and DataStax tools.
• Experience working with multiple types of sensors and data gateways.
• Working experience with Microsoft IoT tools is preferred (not mandatory).
• Similar experience is also fine, i.e. experience in the above areas using different toolsets.

Qualifications:

Bachelor's degree (preferably in a Science subject), Bachelor's degree in Computer Science or Computer Applications (BCA), or A Levels in Maths and Computing with a Certificate or Diploma in a programming language.

Duties:

• Develop the IoT integrations as per project delivery requirements.
• Develop live and batch data streaming services for IoT devices.
• Work with Business Analysts, Technical Architects and Solution Designers to understand the detail of the designed solutions and develop the requirements using different programming languages and tools.
• Liaise with the Project Manager and PMO team, follow the task timelines and report progress and delays.
• Highlight any risks and issues affecting development completion timelines to the PMO team.
• Work with team members and developers and support System Integration Testing.
• Unit test the code and fix defects identified during SIT, UAT and other testing phases.


DevOps - Programmer
Industry Experience: 2 Years+

Desired Skills:

• Apache httpd, Apache Tomcat and Nginx web servers, and Varnish Cache.
• UNIX (RHEL/AIX/HP-UX/Solaris) and Windows environments (Windows 2000/2003/2008 and Windows 10).
• Software tools: MATLAB, Mentor Graphics, OPNET, Elasticsearch.
• Programming languages: Python with Django, Flask and other similar scripting tools.
• Open-source and other NoSQL / SQL databases: MongoDB, CouchDB, RDBMS SQL, Oracle, MS SQL.
• Front-end languages: JavaScript, the Angular2 framework.
• Kibana and Geneos tools.
• Knowledge of LAN and WAN protocols: HDLC, PPP.
• Knowledge of VLAN configuration and switching protocols such as VTP and STP.
• Knowledge of all the programming languages and technologies listed above is not essential, but the more the better.

Desired Experience:

• Experience developing and supporting project environments.
• Experience of CI/CD processes using multiple deployment toolsets.
• Experience from schema design through to administration in MongoDB, Apache CouchDB, Elasticsearch, Oracle and MS SQL Server.
• Development experience in JavaScript, the Angular2 front-end framework and Python with the Flask web microframework; experience with Python Django.
• Expertise in shell scripting and Python and Perl programming.
• Good knowledge of capacity planning and performance tuning.
• Experience using Kibana and Geneos for platform and project dashboards.
• Installation, deployment, hot-fix, patching, release and configuration management experience.
• Experience in the design, implementation and administration of reliable, low-latency and efficient messaging solutions is a plus (not mandatory).
• Similar experience is also fine, i.e. experience in the above areas using different toolsets.

Qualifications:

Bachelor's degree (preferably in Computer Science or Networking), or A Levels in Computing with a Certificate or Diploma in programming languages or the IT toolsets listed above.

Duties:

• Develop the required platform environments as per project delivery requirements.
• Help design the code-merge working environment and the CI/CD structures for project deployments.
• Work with peer programmers, software developers and Solution Designers to understand the detail of the project requirements and proposed solutions, and accordingly develop the integration, environment and continuous deployment framework using different programming languages and tools, e.g. Jenkins.
• Liaise with the Project Manager and PMO team, follow the tasks and timelines and report progress and delays.
• Highlight any risks and issues affecting deployment completion timelines to the PMO team.
• Work with team members and developers and support the testing phases, as multiple test runs may need to execute in parallel.


Big Data - Software Developer
Industry Experience: 2 Years+

Desired Skills:

• Big Data: Hadoop, MapReduce, HDFS, Hive, Java (JDK 1.6), the Hortonworks or Cloudera Hadoop distributions, or better still, experience with the latest merged frameworks.
• MapR, Pig, Hive, Python, Sqoop, Spark, Mesos, Luigi, Azkaban, YARN.
• Kafka, Oozie, ZooKeeper, Ganglia.
• AWS suite: EC2, EMR, Redshift and similar tools.
• ETL tools: Business Objects Data Integrator r2/r3.2, Data Quality, Data Insight, Data Federator, Universe Data Cleanse (UDC).
• Databases: Oracle 9i/10g, SQL Server 2005/2008 (SSIS, SSAS, SSRS).
• NoSQL databases, including open-source databases such as MongoDB or GraphQL.
• Knowledge of all the tools and technologies listed above is not essential, but knowing most of them, or equivalent / similar tools, is preferred.

Desired Experience:

• Working experience building, designing and configuring medium to large-scale Hadoop environments.
• Excellent understanding / knowledge of the Hadoop architecture and its various ecosystem components, such as HDFS, Job Tracker, Task Tracker, Name Node, Data Node and the MapReduce programming paradigm.
• Hands-on experience installing, configuring and using Hadoop ecosystem components such as Hadoop MapReduce, HDFS, HBase, Oozie, Hive, Sqoop, Spark, Kafka, ZooKeeper and YARN.
• Experience monitoring a Hadoop cluster environment using Ganglia.
• Experience working with Big Data, Spark, Scala and the Hadoop File System (HDFS).
• Experience working with different Big Data variants, whether in the cloud (AWS, Azure) or on premises (native, Cloudera and Hortonworks).
• Strong knowledge of Hadoop and Hive, including Hive's analytical functions.
• Similar experience is also fine, i.e. experience in the above areas using different toolsets.

Qualifications:

Bachelor's degree (preferably in a Science subject), Bachelor's degree in Computer Science or Computer Applications (BCA), or A Levels in Computing with Certificate or Diploma courses in programming languages, database design or subjects relevant to the skills required above.

Duties:

• Develop on Big Data technologies as per project delivery requirements.
• Develop data ingestion and data structuring for optimised storage and bulk processing of data.
• Import and export data into HDFS and Hive using Sqoop and other similar tools and utilities, e.g. Microsoft Azure Data Factory.
• Load and transform large sets of structured, semi-structured and unstructured data.
• Work with Business Analysts, Technical Architects and Solution Designers to understand the detail of the designed solutions and develop the requirements using different programming languages and tools.
• Liaise with the Project Manager and PMO team, follow the task timelines and report progress and delays.
• Highlight any risks and issues affecting development completion timelines to the PMO team.
• Work with team members and developers and support System Integration Testing.
• Unit test the code and fix defects identified during SIT, UAT and other testing phases.

Data - Programmer
Industry Experience: 2 Years+

Desired Skills:

• Modelling tools: Erwin, PowerDesigner, MS Visio.
• Scripting languages: PL/SQL, T-SQL, R, Power Query, DAX, MDX.
• Business Objects (OLAP): Business Objects, BO Supervisor, Designer, Reporter and Web Intelligence.
• MapReduce, Pig and Python.
• ETL tools: BODS, Informatica.
• SQL Analyzer: SQL, PL/SQL.
• Database technologies: Oracle, DB2, Netezza, SQL Server, HDFS, HBase, Hive, Impala.
• Algorithm development / problem solving and data flow resolution using programming languages such as Python and database scripting languages.

Desired Experience:

• Experience in data processing, including designing and implementing Data Mart applications, mainly transformation processes using ETL tools.
• Experience working on medium to large data warehousing development projects, or data reporting and MI life-cycle projects.
• Experience working on data migration projects, or involvement in supporting the data side of migration life cycles.
• Some experience with ETL methodology for designing and developing extraction, transformation and load processes in dynamic, high-volume data environments.
• Experience modelling databases for a data warehouse, implementing Star and Snowflake schemas.
• Experience in code specifications and OLAP analysis and design.
• Experience using Business Objects, including the creation of Universes, reports and InfoViews.
• Knowledge and experience of SSIS / SSRS / SSAS.
• Experience solving technical problems related to data.
• Experience with data profiling and data classification exercises.
• Experience in data cleanse approaches and tools, and in automating manual data cleanse logic.
• Experience analysing data sets to determine the optimal way to aggregate and report on them.

Qualifications:

Bachelor's degree (preferably in Computer Science, Mathematics or Statistics), or A Levels in Mathematics with a Certificate or Diploma in structured programming languages, or in relational or NoSQL databases.

Duties:

• Work on data-centric projects, such as data migration or data cleanse projects.
• Automate the data cleanse routines, or automate the data generation required for projects during data migrations, or for testing and performance testing scenarios.
• Work with Architects and Solution Designers to understand the detail of the designed solutions and develop the data requirements using different programming languages and tools.
• Work on data obfuscation and data de-duplication as per project requirements.
• Work closely with Data Scientists and Data Analysts to develop the data sets they need for their research and investigations.
• Assist the Data Science team with the development of solution logic and algorithms for given data problems.
• Use programming, scripting and tools to automate and reduce the manual effort of the data entry process, and work on data loads.
• Liaise with the Project Manager and PMO team, follow the task timelines and report progress and delays.
• Highlight any risks and issues affecting development completion timelines to the PMO team.
• Work with team members and developers and support System Integration Testing.
• Unit test the code and fix data-oriented defects identified during SIT, UAT and other testing phases (important).

Note: Candidates who do not have the right to work in the UK can also apply, whether their right-to-work status has changed under the Brexit rules effective from 1st January 2021, or they are residents of other countries and would need a UK work permit; work permit sponsorship will be considered for the right candidates.

Copyright © Computewell 2018