We use a highly curated database to find the candidates with the skills best suited to your business’s needs. Browse the talent below, find the perfect fit, and our qualified customer service agents will take care of the rest.
Sr. Data Engineer (Hadoop/Spark/Kafka/Cassandra/Scala/Java)
9.5 years of product development experience across domains including banking, e-commerce, telecom expense management, API management on the AWS cloud and IoT. 5 years of rich experience designing and developing Big Data processing platforms, including ETL applications using the Cloudera Hadoop Distribution (CDH), real-time data processing and analysis using Apache Spark Streaming, and IoT data analysis using Storm. Experience building data processing and analytics platforms with various open-source technologies. Experience with the agile software development process and lifecycle. Expertise in Hadoop MapReduce, Hive, Oozie, ZooKeeper, Flume, Spark, Storm, Kafka, Cassandra and Elasticsearch. 5 years of experience with various AWS services such as EC2, S3, RDS, ELB and Kinesis.
• 12+ years’ experience in deriving pragmatic, actionable insight from data
• Advanced analytics, predictive modelling, data visualisation
• Process automation, machine learning
• Locating bottlenecks and optimising business operations
• Strong background in engineering research and development
• Data lifecycle management (ETL)
• Product optimisation, design of experiments using simulation tools
https://www.linkedin.com/in/kprzysowa/
Currently a freelance WordPress and front-end web developer, with good knowledge of PHP as well. I have a professional attitude and work hard to deliver what the client wants, on time and on budget.
Reliability, stability and security are at the core of SPARK. With its foundations in Ada, this programming language is commonly used in high-spec projects which call for ultra-low-defect software. Demand for these types of applications is increasing across all areas of industry, from military and cryptographic applications to software for the aviation and transportation sectors. SPARK is a widely respected piece of kit for such requirements, with over two decades of development to draw on. In an ever more security- and stability-conscious world, the demand for SPARK expertise continues to grow.
Java is considered by many programmers and developers to be a “bread and butter” language. This widely known, multi-paradigm language underpins a huge share of the server-side, enterprise and mobile software we’re ever-more dependent on today. Incredibly widely used, Java has earned its popularity through its adaptability and (relative) simplicity. Many users are pleased to find that achieving the basics with Java is relatively straightforward. Once expertise is gained, however, there’s a huge amount that can be done with Java in an almost endless range of settings. Often preferred to alternatives such as C++, Java’s security and effort-saving features are also worth noting.
There are many languages and tools which can be used to work with big data. The key challenge of working with big data lies in the name. Big data can be vast, which makes it especially tough to collect, store, manipulate and analyse, whether you’re generating insights or turning data into interactive graphics. At the top of the big data programming tree you’ll find R, SAS, Python, SQL, Scala and (to a lesser extent) Java. R is arguably the most powerful; however, as it is more of a statistician’s language, programmers from other backgrounds can sometimes find it harder to get to grips with. R is also a highly specialised language, which makes it far less useful as a general-purpose language. The widely and flexibly used Python is perhaps a more practical choice for developers looking to work with big data.
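To make that concrete, here is a minimal Python sketch of the chunked-processing pattern often used when a dataset is too large to load in one go. The file name and column names (“sales.csv”, “region”, “amount”) are assumptions made for the example, not part of any particular project.

```python
# A minimal sketch: stream a large CSV through pandas in chunks so the
# whole file never has to sit in memory at once.
import pandas as pd

totals = {}
for chunk in pd.read_csv("sales.csv", chunksize=100_000):
    # Aggregate each chunk, then fold the partial results together.
    partial = chunk.groupby("region")["amount"].sum()
    for region, amount in partial.items():
        totals[region] = totals.get(region, 0) + amount

print(pd.Series(totals).sort_values(ascending=False))
```

The same idea scales up naturally: frameworks such as Spark apply an equivalent split-aggregate-combine approach across a cluster rather than a single machine.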
Free and open source, MongoDB is a NoSQL database program which stores data as JSON-style documents. Its document-oriented model frees users from rigid, predefined schemas. There are many advantages to this type of database, including the absence of complex joins, the clear structure of single objects, easy scalability (a particular strength of MongoDB), the ability to index on any attribute and support for deep, dynamic queries. Accessing data via MongoDB is often faster than alternatives thanks to the program’s use of internal memory for storing the working set.
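As a brief illustration, the sketch below uses the pymongo driver against a local MongoDB instance to show the document model, indexing on an attribute and a join-free query. The database and collection names and the document fields are illustrative assumptions only.

```python
# A minimal sketch with pymongo, assuming MongoDB is running locally.
from pymongo import MongoClient, ASCENDING

client = MongoClient("mongodb://localhost:27017")
collection = client["talent_db"]["developers"]

# Documents are JSON-style: no table schema has to be declared up front.
collection.insert_one({"name": "Jane Doe", "skills": ["Python", "Spark"], "years": 9})

# Any attribute can be indexed, including values inside an array.
collection.create_index([("skills", ASCENDING)])

# A query on the indexed attribute; no joins are involved.
for doc in collection.find({"skills": "Spark"}):
    print(doc["name"], doc["years"])
```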
Couldn’t find what you’re looking for?
How can we help you? Enter your email address and leave a message.