Data Mining and Machine Learning Engineer with a PhD. Experienced with Hadoop, Hive, Mahout, R, and Weka, with development experience in Java.
Pentaho Certified Developer with a 92% score. 4+ years of hands-on experience in Pentaho, JasperSoft, data warehousing, data analytics, and dashboard creation. Expert in writing Hadoop MapReduce jobs. Extensively worked with databases such as MySQL, SQL Server, MongoDB, and PostgreSQL. Certified Java Developer with 6+ years of experience: Struts/Spring/Hibernate frameworks, JDBC, JSP, HTML, JBoss, Tomcat, RESTful APIs.
I have 10+ years of experience designing and developing enterprise solutions in Big Data and mobile technologies. I am a certified Big Data developer/consultant with additional certifications in mobile application development (Android/iOS) and Java web services. I work in an SME pool for Hadoop technologies and have deployed clusters in AWS, Rackspace, and within private client networks. I have also developed analytics models and statistical regressions in MapReduce and Spark for data processing, and have set up multiple big data clusters on Amazon and in-house using the Cloudera and Hortonworks distributions. I have also written blogs and content about Big Data. I also have 35 Android apps published on the Google Play Store and Amazon Appstore, and many iOS apps created for different clients. More details about me, my white papers, and my implementations can be found at http://ngvtech.in
I have 15+ years of experience working with reputed multinational companies in India and the US, involved in designing, developing, and troubleshooting large corporate applications. My expertise is in cluster configuration, Hadoop, Hyper-V, storage management, and storage virtualization. I have also worked on e-commerce web portal development and performance optimization.
15+ years of seasoned experience in the SDLC (Software Development Life Cycle). US citizen with active MBI (Minimum Background Investigation) clearance. 3 years of experience in Big Data technologies (Hadoop HDFS/MapReduce, Pig, Hive, Flume, Sqoop, Oozie). Good understanding of NoSQL databases: HBase, Cassandra, MongoDB. Proven expertise in architecting and implementing software solutions using Big Data, ESB, BPM, and SOA. Experienced in both Waterfall and Agile (Scrum) development methodologies. Domain experience in federal government, consumer, pharmaceutical, financial, insurance, medical devices, telecommunications, and high-tech industries. Excellent analytical and communication skills.
5 years of experience in IT, including 2.4 years in Big Data and cloud computing on Amazon. Migrated projects such as COBOL programs running on mainframes, DB2 queries, Oracle queries, and long-running SQL queries into a Hadoop distributed environment, along with many POCs.
I have successfully completed the Databricks/O'Reilly Apache Spark certification. Technologies: Java, Scala, basic Python. Big Data: Hadoop and the Hadoop ecosystem, Apache Spark and its ecosystem, Storm. NoSQL and query engines: HBase, Cassandra, MongoDB, Neo4j, SQL databases, Hive, Pig, PrestoDB, Impala, Spark SQL. Search: HSearch, Apache Blur, Lucene, Elasticsearch, Nutch. Other: Oozie, Apache Kafka, RabbitMQ, ExtJS. Cloud: Amazon Web Services (EC2, Elastic MapReduce, S3), Google Cloud Platform (BigQuery, App Engine, Compute Engine), Rackspace (CDN, servers, storage), Linode Manager. Data domains: e-commerce, social media, logs and click-event data, next-generation genomic data, oil & gas, health care, travel.
Hadoop / Pig / Hive / Oozie / Java / J2EE developer and Big Data enthusiast. About 2 years of Hadoop experience and about 7 years of Java experience. I have good experience with the Hadoop ecosystem, good knowledge of Pig, Hive, Flume, Oozie, and Elasticsearch, and good knowledge of and experience with the Spring Framework and Hibernate.
3 years' experience in Hadoop, Android, and Java development. Working knowledge of Hadoop, including experience in installation, development, and implementation. Experience writing Pig scripts to process and analyze data. Working knowledge of Android and Java. Experience installing, configuring, and administering Hadoop clusters of major distributions such as Apache Hadoop and Cloudera. Experience working with different operating systems: Windows XP, UNIX, and Linux. Developed applications for distributed environments using Hadoop, MapReduce, and Java.
I have a BE in Information Science. Currently I am working for Edureka as a Research Analyst, and I consider myself a Big Data expert. I have been working on Hadoop for 3+ years and have also worked on Spark and Storm.
Hadoop and Big Data fan. I am working as a Big Data architect and offer end-to-end Big Data integration skills. My LinkedIn profile is http://au.linkedin.com/in/jagatsingh. If you search my name (Jagat Singh + Hadoop) on Google, you can find more about the work I do (or have done) in the Hadoop world. I have experience in MapReduce, Pig, Hive, Sqoop, Oozie, Storm, Mahout, RHadoop, R, and Cascading. I hold the following certifications: * Cloudera Certified Hadoop Developer * Cloudera Certified Hadoop Administrator * IBM Certified Cloud Solution Provider * Sun Certified Java Programmer * ITIL certification for managing IT services. I offer complete end-to-end solution design/architecture and deployment skills. I work on only one project at a time, so before confirming I would share with you what my workload is and when I can start for you. I am currently in the Melbourne timezone. Happy to talk further with you. Thanks for reading.
Hadoop developer, Hadoop administrator, Oracle database administrator - these are my core capabilities. I offer an end-to-end solution for the proposed work package within the stipulated time, along with the best quality of code.
I have 10 years of experience in software and web development, with good experience in design and development. I have worked on various kinds of projects - travel domain, social media monitoring, search technology, financial, and website development - using Java/J2EE, Hadoop, HBase, MapReduce, Hive, MongoDB, Lucene, Spring, Hibernate, Struts, SOAP and RESTful web services, and MySQL.
I started working on in-house data integration projects between disparate source systems for a pharmaceutical company back in Peru, then moved to build a cloud-hosted data warehousing solution using SQL Server and Java while working for an Australian software company. I have been lately trying out several big data use cases, and working on some sandbox projects so I am really interested in using Hadoop technologies in order to tackle those big data challenges that companies face nowadays.
I have six years of experience developing Java/J2EE technologies and roughly two years of experience in Hadoop. I am a senior technologist with strong business and technical experience in the Big Data space: a results-oriented, decisive leader who combines an entrepreneurial spirit with corporate-refined execution in Big Data strategy, consulting, implementations, and architecture. Areas of expertise include: ~ Hadoop administration and development ~ NoSQL and real-time analytics ~ Big Data/Hadoop solution architecture, design & development ~ social media, community & viral marketing ~ business development at global scale ~ Big Data strategy
I have good exposure to a range of technologies, from core and advanced Java up to Hadoop and Tableau reporting. My aim is to work on tougher tasks and deliver the best to clients, while getting a grip on cutting-edge technologies.
We are very enthusiastic about Big Data technologies like Hadoop, MapReduce, Hive, HBase, Sqoop, Oozie, and Scala, and NoSQL databases like MongoDB. We are proficient in Java and can develop complex MapReduce applications per the client's requirements to handle large volumes of data. We are capable of setting up clusters for parallel distributed computing in Hadoop and can install all the required software for the setup. In a word, we can handle everything from scratch to product, including setting up the cluster, installing components of the Hadoop ecosystem, and developing MapReduce and Hive applications integrated with components like HBase, Sqoop, and Oozie.
I have over 3 years of experience as a Hadoop developer and am expert in the following areas: 1) installing and configuring Hadoop clusters; 2) importing data from SQL databases into HDFS using Sqoop; 3) writing Hive queries to load and process data in HDFS; 4) writing MapReduce programs in Pig; 5) knowledge of Hadoop and Hive analytical functions; 6) exporting data from HDFS to Impala.
SUMMARY: Overall 10+ years of experience in data analytics and 4+ years building data-intensive Big Data applications and products using open-source frameworks such as Hadoop, Pig, Hive, HBase, Sqoop, Spark, Scala, Mahout, and R. Worked closely with marketing, risk management, and operations teams to identify business problems and design Big Data analytical solutions. Experience designing, developing, tracking, and measuring digital marketing campaign performance based on A/B testing methodology, and communicating insights to business users and partners. Developed predictive models and targeting and segmentation models based on consumer behavior data, transactional data, marketing campaign data, and other historical and third-party data elements.
Professional with 13 years of experience in C/C++, Java, J2EE, Oracle, data mining, database tuning, and other database and web technologies. Have designed and implemented J2EE applications with the Spring Framework, making extensive use of Spring Batch, Web, and Integration. Experience in multiple domains, along with good business skills. Extensively worked on analytics applications with Hadoop, Cassandra, and MongoDB.
We have skilled data scientists and Hadoop developers with excellent programming skills (Apache Hive, Apache Pig, HBase, MongoDB, Cassandra, Spark, Flume, Sqoop, SQL, MapReduce, R, Mahout, Tableau, Pentaho, Java, JSP, MS Excel).
I have 3 years of experience in Java, and for the past 1.5 years I have been working in Big Data technologies such as Hadoop MapReduce, Apache Spark, Elasticsearch, GIS Big Data, Cassandra, and Apache Storm. I am currently working as a Software Engineer (Team Lead) at Platalytics Inc. (democratizing big data analytics). My job includes writing MapReduce and Spark jobs for data processing, filtering, and analytics; storing and fetching data and optimizing queries in Big Data databases; and providing Big Data solutions to clients. I have developed and deployed multiple big data solutions. I understand clients' requirements very well and propose optimized solutions for their problems.
We are experts in developing Big Data, web, and mobile applications. Our expertise includes developing applications with full-stack delivery (front end, back end) from scratch, and preparing learning materials (courseware) for Big Data and mobile technologies. Our focus: Big Data, mobile, and web development. Mobile: iOS (iPhone/iPad), Android, BlackBerry, Windows Phone; cross-platform: PhoneGap, Appcelerator Titanium. Web: PHP, LAMP, Yii, ASP.NET, Java, J2EE, WordPress, Joomla!, Magento. Big Data: Hadoop, Hortonworks, Cloudera, MongoDB, Cassandra, Python, R. We also offer testing and quality assurance, and project management and requirements analysis. Our main advantages: quick responses, on-time delivery, quality work, and delivery according to estimated costs.
We provide solutions for big data problems using Apache Hadoop, Apache Spark, HBase, Storm, Hive, Spark SQL, Kafka, Oozie, Cassandra, Solr, Lucene, Lily, and other tools in the Hadoop ecosystem. We also specialise in machine learning using Mahout, MLlib, GraphX, and R.
After spending a decade working for MNCs on enterprise-level products, I now want to utilize my expertise on my own.
Over the last 4 years, I have worked extensively on Java/J2EE/Big Data projects, building everything from small beginner-level applications to enterprise-level distributed applications. Hands-on experience in: core Java, Hadoop MapReduce, HBase. Extensive experience in the design and development of ETL tools, network performance monitoring products, and log parsers, from scratch to production-level products. Specialties: core Java, the Hadoop ecosystem, building ETL tools. IDEs: Eclipse, NetBeans. Familiar with core design patterns and Big Data design patterns.
To bring together knowledge of networking, programming, cloud, and open-source tools. Extensive knowledge of the various networking protocols of TCP/IP. Worked extensively on Ubuntu, BackTrack, Fedora, and CentOS. Implemented the MapReduce framework in Java using Hadoop on multi-node clusters. Specialties: Network security: network intrusion and detection; Metasploit, nmap, netcat, HackerDefender, rootkits. Languages: C, Java, SQL, HTML, XML. Scripting: Perl, Bash, Python. Other: Hadoop, MapReduce, Hive, Pig, HBase, Apache Maven, Oozie, Sqoop, Apache Ant, ETL processes, Apache HttpClient, JBoss, Tomcat, OpenTSDB. Tools: Eclipse, NetBeans, Karmasphere. Business intelligence: MicroStrategy.
Vivek is a Hadoop software consultant. He has worked as a Hadoop developer and performance architect in the past, is a certified trainer from Cloudera on Hadoop, and is currently associated with SoApt Software Solutions. He currently writes for the SoApt blog at www.soapt.com/blog
Senior technologist with strong business acumen and technical experience in the Big Data space. Results-oriented, decisive leader in the Big Data space who combines an entrepreneurial spirit with corporate-refined execution in Big Data strategy, Big Data consulting, implementations, CoE setup, architecture, pre-sales, and revenue optimization. Adept builder of strategic partnerships across technical and global businesses for Fortune 500 customers. Areas of expertise include: ~ Hadoop administration and development ~ NoSQL and real-time analytics ~ Big Data/Hadoop solution architecture, design & development ~ social media, community & viral marketing ~ business development at global scale ~ Big Data strategy ~ Hadoop training. Specialties: business strategy consulting, planning & operations in the Big Data space, Big Data architecture, Big Data/Hadoop implementations, Hadoop administration, NoSQL/HBase/Cassandra, EDW.
10+ years of experience in Java and Big Data (Hadoop), Struts, and Spring.
Just a brief about me: I am a software consultant with more than 14 years of experience in various Java/JEE-based technologies. In the past I worked in the United States at various Fortune 500 companies for 4+ years on an H1B visa. Most recently, I worked for Dell Services India as an individual contributor for more than 4.5 years, leaving Dell in May 2013. I am looking for a similar kind of consulting role. I have provided consultancy to various organizations in the United States, including Sony Motion Pictures, AAA, GAP Inc., Merrill Lynch, World Bank, General Motors, Nissan, Prudential, the IEEE society, Deutsche Telekom (in Germany), and Dell Services in India. I have good exposure to the following domains: finance, investment banking, insurance (auto, home, life), retail, entertainment (motion pictures), automobile, telecom, and healthcare. I am also Big Data & Hadoop certified with Grade 'A' from a well-renowned institution, Edureka. Please let me know and I can provide my detailed resume.
With more than 12 years of industry experience, I have worked on Oracle PL/SQL, Oracle DBA, and Informatica PowerCenter. I have been involved in and led teams from start to end, and have also done POCs on Hadoop for around 2 TB of data.
I have 1+ years of experience as a Hadoop developer and admin, with excellent programming and logic skills in big data analysis and managing unstructured data. Strong knowledge of Pig, Hive, MapReduce, HBase, Sqoop, Spark, Java, and Unix scripting.
I have 3.5+ years of Hadoop and data engineering experience. I have worked on a variety of technologies, spanning from Hive ODBC driver development to recommendation engine development for online streaming media services. I deliver very high-quality work and am able to quickly grasp any new technology. [I'm new on elance.com; I have a 5-star profile on oDesk at https://www.odesk.com/users/~019b5641196dbbc2c3]
Strong Interest and extensive experience in the following areas: * Machine Learning * Big Data * Computer Vision * Web development (frontend and backend) Expert in using libraries and frameworks like: * Apache Hadoop (pig hive impala oozie hbase mahout) * J2EE * ASP.NET * jquery, knockout, d3, bootstrap * opencv Preferred Languages : * Java * C#
Around 14 years of experience in the software industry. Worked on different technologies including Hadoop, Java, database administration, SAS, and QA automation with Perl and shell scripting. Vast experience handling large teams; successfully completed projects in Agile environments. More details about companies worked for and a detailed skill set available on request.
I am a Hadoop-BigData Developer
Working as a software developer for the past 6.2 years, currently with a leading IT services company. I have a keen interest in application and product development. Working as a Hadoop developer, which comprises MapReduce, Pig, and Hive. Have worked on database development projects using Oracle, MySQL, and Java. Working on a NoSQL database: Cassandra.
Experienced Java backend developer. I also have experience with Java frontend frameworks (such as Tapestry and ZKoss). Databases: MySQL, Hadoop, MongoDB, Redis, Couchbase. Test frameworks: JUnit, Mockito, Robot. I have extensive experience with the Hadoop stack (Pig, Hive, Impala, Spark, Flume, Kafka, HDFS, MapReduce, HBase).
We are a US based consulting group providing secure solutions in big data, including Amazon Web Services (AWS) implementation and Hadoop solutions. Our developers and associates are all US based residents and honors graduates from American universities, assuring our clients of the highest level of expertise as well as clear and concise communication. This guarantees the shortest turnaround time for your project while keeping your data secure in the US. Pier Group Consulting is an AWS Consulting Partner specializing in cloud enablement. We focus on AWS and its products, including Amazon S3, EC2, RDS, EMR and Elastic Beanstalk. W-9 available when required.
I am a seasoned software developer. My current skills are in open-source technologies: Java, Hadoop, HBase, MapReduce, YARN, and other Hadoop ecosystem technologies, as well as RESTful web services. I have developed software for both Windows and Unix-based platforms. I have worked for both corporate and non-profit organizations, including Novell, Adara Networks, Symantec, and The Church of Jesus Christ of Latter-day Saints, to name a few. I have always received favorable reviews from managers, as well as bonuses and stock options. I take a very methodical and agile approach to software development and practice test-driven development. I am skilled in C#, Java, and Perl. I am considered a hard worker and a person of integrity, and am able to design big data systems.
# 10.5 years of experience in design and development of algorithms, enterprise, rich internet and client-server applications. # 1.5 years of experience as Scrum master for Sybase Control Center IQ 3.2.3 and Sybase Central IQ 15.4 releases. # 6 years of experience working in Agile, Scrum, TDD and BDD environment. # Proficient in Scala, Spark(MapReduce / Hadoop), Spark SQL (Shark / Hive), Slick, Akka, Flume, Sqoop, Scalding, Casbah, Java, Multi-threading, Web services (Rest, DWR), Fortify, Veracode, MongoDB, HBase, HDFS, Oracle, MS SQL, Sybase IQ, JSON, JAAS, JCE, JMX, XML, JMock, ScalaTest, ScalaCheck, Specs, JUnit, Jenkins, TeamCity, CruiseControl, SBT, Maven, Ant, Agile and Scrum. # Master of Technology (Information Technology) from IIT Bombay, India; and Bachelor of Engineering (Computer Science & Engineering) from HCET, India. # Highly motivated, quick learner and enthusiastic person with good communication skills.
Extensive experience in big data analytics and machine learning using Hadoop, R, Python, Spark, and Mahout. I have exhaustive experience in machine learning algorithms, mainly around text mining and predictive analytics (regression, classification, ranking, etc.). My total industry experience is 6+ years.
Looking for a good opportunity in MongoDB, Hadoop, Pig, and Hive. I am currently working at an MNC, have 3.5 years of experience, and am capable of delivering a project on my own.
I am a seasoned IT testing professional with eight-plus years of experience in both hardware and software using the latest testing tools and methodologies. I have carried out projects in both India and the United States at Fortune 500 companies like Xerox. Good knowledge of quality standards like CMM, ISO, Agile processes, and the Rational Unified Process. Excellent experience handling the onsite-offshore model, managing a team of 50 resources, and interacting with the state and the client. 1.6 years of experience in Hadoop ecosystems, HDFS, Big Data, ETL, and RDBMS. Hands-on experience working with Hive, Pig, Sqoop, and MapReduce. Experience in data cleansing and data mining. Loaded datasets into Hive for ETL (Extract, Transform, Load) operations. Good knowledge of HBase. Excellent knowledge of core Java. Excellent experience handling complex and large projects. 2 years of experience as an onsite project manager.
We are experts in data management, with 10+ years of experience in ETL, data warehousing, and reporting. We use the following tools. ETL: Ab Initio, Informatica, Talend 5.5.0, Microsoft DTS. Reporting: Tableau, Pentaho, Logi Analytics, MicroStrategy. DB: MySQL, SQL Server, Oracle, DB2, Netezza. Big Data: Cloudera Hadoop CDH4 MR1. Salesforce. We are all about data: you have the problem, and we have the solution for your data.
Expert full-stack architect with 14 years of experience in J2EE, PHP, and Hadoop big data technologies, delivering high-performing, scalable web and mobile applications. PRESENTATION - YUI, GWT, Vaadin, jQuery, ExtJS, AngularJS, Twitter Bootstrap, AJAX, JSP, JSTL, JSF, Swing, CSS3, HTML5, Spring MVC, JS, Tiles. TECHNOLOGIES - Spring, EJB, Servlet, JTS/JTA, Solr/Lucene, RabbitMQ, OXM (Castor, JiBX, JAXB), ORM (Hibernate), SOAP/REST web services, JMS, JDBC, RMI, JNI, JavaMail, Quartz for scheduling. DATABASES - MySQL, Oracle, SQL Server. SERVERS - WebLogic, Tomcat, JBoss. PLATFORMS - Windows, UNIX, Linux, CentOS, Red Hat. IDE - Eclipse, NetBeans. BUILD - Ant, Maven, Gradle. METHODOLOGY - Agilo, Trac, GitLab, Trello. VERSION CONTROL - CVS, SVN, Git. TESTING - JUnit, Eclipse plugins, Selenium, Badboy, own test suite. REPORTING - JasperReports, BIRT. BIG DATA - Hadoop, MongoDB, Hive, HBase, Elasticsearch, Solr, Lucene.
More than 22 years of professional experience. I am VP of Business Development and Engineering at Sibers. My goal is to provide maximum additional value for customers by enhancing processes and practices with the power of Lean/TOC/Agile approaches.
I am skilled with the latest Big Data technology, Apache Hadoop, and its ecosystem components like Pig, Hive, HBase, ZooKeeper, etc. I can help set up Apache Hadoop in standalone mode, pseudo-distributed mode, and fully distributed mode. I can even help you administer Apache Hadoop for efficient use of resources.
Specialized in: Hadoop and HBase cluster setup and configuration; HDFS and HBase programming. We have built a highly scalable, high-performance Amazon S3-like cloud storage system (API-compatible). It has been used and deployed by several enterprises. Please contact us for more details.
I'm a software engineer, tester, DevOps engineer, Linux/Windows/Mac administrator, and cloud security & provisioning expert. As a developer, my ability to write code in multiple languages (Java, Scala, Python, C++, C#) allowed me to be assigned to critical core components of my current company's internal software platform. As a DevOps engineer, I built the entire deployment and cloud provisioning story, which requires comprehensive and deep knowledge of an entire software platform: all the modules and components that need to be packaged, the open-source components (Hadoop, Spark, Mesos, Tachyon, Cassandra, ZooKeeper, RabbitMQ, nginx, PostgreSQL, etc.), and the company's internal components built on top of them. I've become expert in configuring and performance-tuning all these systems for optimal results. I'm the go-to person when it comes to configuring the target OS for deployment and development environments (Ubuntu, RedHat, MacOS, Windows), VPC, EC2
NetClick is a next generation technology solutions provider focused on streamlining big data enterprise solutions from cloud computing to Enterprise Social Media. With our intense focus on streamlining the haphazard nature of technology solutions used for any type of business, we provide you a solid array of solutions that improve your business. Our Services : Enterprise Big Data Harnessing Strategy & Solutions Enterprise Cloud Strategy & Solutions Enterprise Mobility Strategy & Solutions Enterprise Social Media Strategy & Solutions Business Intelligence Gathering Strategy & Solutions
DataHolics develops enterprise class Big Data solutions that bring visibility to a treasure trove of data already flowing through the organization.
I like to understand the fundamentals of requests and issues. I believe in working proficiently to provide scalable, technically well-regarded solutions, I devote 100% to the work as I do on my own products, and I respond promptly to avoid delays in service.
With more than 14 years of experience in a wide array of software systems, working with global customers, we believe experience and vision are our key drivers. We have extensive experience in Oracle SQL/PL-SQL, shell scripting, system analysis and design (Waterfall/Agile), managing support functions, metadata-driven software development, and ETL software development using Informatica PowerCenter, Pentaho, Talend, etc.
Web Developer having 3+ years of experience with web technologies
1. Expert virtualization specialist and architect - OracleVM, LDoms, zones, ZFS, domains. 2. Knowledge of Hadoop installation on CentOS; virtualization expert on OracleVM; expert in Linux and Solaris. 3. Experience and knowledge of data center server/rack consolidation and data center cost reduction. 4. Expert knowledge of VxVM and VxFS. 5. Experience in reducing power and cooling costs for the company. 6. Expert in designing for data center capacity management. 7. Work experience in large companies like GE, Bellcore, and CMC. 8. Hardware experience with large Oracle servers like E10K, SF6800, M5000, T5-2, and the 7000-series ZFS appliance.
9+ years of experience in Pentaho, Talend, and Informatica ETL; JasperServer reports, dashboards, ad hoc reporting, analysis views, and domain administration; Pentaho PDI (Kettle ETL) and administration; and QlikView and Crystal Reports. Sound knowledge of reporting, JasperServer administration, ad hoc reporting, Pentaho administration, and ETL & DW development. Possess good knowledge of SQL queries, ETL, and data warehouse concepts. Efficient in Word and Excel for preparing user-specific requirements analysis. Possess good team management skills and experience leading a team. Strong team player who has led teams at client sites and offshore. Scaling open-source DWH using cluster & NoSQL concepts (Hadoop, MongoDB). Specialties: Informatica ETL, Business Objects InfoView, JasperServer reports, dashboards, ad hoc reporting, analysis views and domain administration, Pentaho PDI (Kettle ETL) and administration, QlikView, and Crystal Reports.
QBurst provides consulting and customisation services in Big Data under a broad reporting and analytics portfolio. As an early adopter of Big Data technology, we have the right skill sets to help organisations chalk out their Big Data strategy.
One of my clients motivated me to follow these golden guidelines: 1. Keep in mind that most of your clients want to feel appreciated. 2. Always treat your clients like you want to be treated. 3. The most successful entrepreneurs are those who know how to treat clients. 4. Every client is a potential referral - always remember that. 5. Continue to work on your business savvy and NEVER let money be your motivation (driver). 6. Chasing money creates a "lack mentality," and you will always have to scrap for money. 7. Prosperity comes from giving your BEST every single time (even when you don't want to).
I am a Cloudera certified data engineer - looking for interesting work on the side.
10+ years of experience in developer, leader, and PM positions. Deep knowledge of solving the problems of big projects: search engines, data mining, HA systems, load balancing, logging systems... Innovative, open-minded, able to work in a group as well as individually, diligent, with good problem-solving skills.
I have been working as a professional software developer for the last 7 years. I have worked as a full-time employee for IBM and Adobe Systems. I am an Oracle Certified Java Programmer.
He has been a C++/Java developer at IBM and Baidu since 2008, focusing on Big Data and high-performance servers. He graduated from Jilin University with a master's degree, majoring in Distributed Systems and Grid Computing.
Vertizone provides BI and Big Data solutions. We have successfully designed and delivered working BI and data warehouse solutions using Pentaho, MySQL, AWS, RDS, Redshift, Hadoop. Consulting Partners with Amazon AWS, Pentaho.
Currently working on a recommendation engine and text analytics for an e-commerce company. Extensive experience with Big Data and Hadoop. Worked on sentiment analysis and predictive modelling for the healthcare, social media, and finance domains. Expertise in R, Tableau, QlikView, Hadoop, YARN, Octave, Python, and machine learning.
Over the last 5 years, I developed India's second-largest data center for a Norwegian telecom company and developed database architecture for applications built using Java and Savvion. I have extensive experience in database migration, defining backup policies, database performance tuning, and assisting clients in developing new applications in terms of hosting servers and databases. I have extensive experience administering Oracle (9i, 10g & 11g), SQL Server 2005/2008, and MySQL 5.5. I have extensive experience configuring Amazon EC2 instances, Amazon RDS, and Amazon S3, along with Amazon CloudWatch. I have successfully completed 3 Amazon AWS implementation projects and database migrations from company-hosted servers to Amazon RDS. Experience in cross-DBMS migration (e.g., migration of Oracle to SQL Server).
Passionate programmer with extensive experience developing web applications. I have over 14 years of experience with information technology solutions. I am a WordPress (top 5% on Elance), Java, and Hadoop developer. I have worked in various segments of the technology world, making my experience unique among developers. I am a highly motivated, creative, smart-working individual. Most of the work that I do today focuses on WordPress and Java development, or anything related to web solutions. I can handle WordPress site management, theme customization, plugin installations, and general WordPress administration and troubleshooting. I like to analyze and develop solutions for businesses to help them with their business efficiency. I can develop almost any type of software, website, or other business solution.
We provide solutions for your big data problems. We are experts in providing big data solutions using Hadoop and SAP HANA. We have Cloudera-certified Hadoop administrators and developers. We are part of the SAP HANA startup focus program. We have more than 20 years of experience working on various databases. We help organizations achieve operational efficiency using our innovative solutions. We excel at rapid development, and our technical team can churn out solutions quickly using proven Agile methodologies without compromising on quality.
Solid experience concentrated on data extraction, mining, transformation and analysis, including big data, using multiple sources and APIs in SQL, NoSQL, MPP and cluster-based systems for the Healthcare, Insurance and Internet/E-Commerce industries. Experience in Data Science and Business Intelligence roles, including large-scale data lake environments. Hands-on experience in the Big Data space (Hadoop stack: MapReduce, HDFS, Pig, Hive, HBase, Flume, Sqoop, etc.), NoSQL stores such as MongoDB, and parallel relational databases such as HAWQ. Implemented real-time analytics and big data processing for the Internet of Things (healthcare wearables) with low-latency requirements using Spring XD and the Pivotal Big Data stack, and implemented a social analytics framework in the Cloudera Hadoop ecosystem. Proven abilities in enterprise data architecture for data integration and management with the latest technologies, such as Teradata v14.x, Informatica 9.x and UNIX shell scripting.
I am a Cloudera Certified Administrator and Developer for Apache Hadoop (CCAH & CCDH), working with Big Data, and am seeking companies interested in efficiently and safely managing their rapidly growing data. I perform my work for a company with a consultant attitude, whether I am a contractor or an employee, and I consider the people I work with my customers. This approach ensures I will always provide high-quality service. In addition, as I fulfill the needs of my customers, I strive to identify and fulfill needs previously unknown to them, which enables me to deliver a product or service that exceeds their expectations. I am very experienced in working on projects remotely. I enjoy automating complex and repetitive processes to eliminate human error and enable personnel to focus on higher-level responsibilities. Allow me to assist you in the success of your next project.
Reliable, skilled provider of data science and data visualization solutions. Main expertise areas are data science (Python, R, Spark, Hadoop), machine learning, D3.js visualizations, dashboard design, and one-page web apps (Flask, Django, Angular).
I have knowledge of Hadoop and Core Java and am currently working as a Hadoop trainer. I am looking for new projects related to Hadoop, where I can provide good service to my clients at low cost and deliver the job on time.
Expert in data warehousing solutions.
Big Data Engineer with more than 2 years of real-world experience implementing Big Data solutions in various industries such as digital media, telecommunications, and banking. Technologies used include Apache Hadoop, Apache Hive, Apache Pig, and Apache Storm.
10 years of overall professional experience, most of it in client handling, project execution, Big Data, data warehousing, ETL, reporting and data quality. Well experienced in df Power Studio, Data Management Studio, Hadoop, SAS programming and SAS DI Studio. Used df Power Studio and Data Management Studio extensively in data quality projects. Used languages like Pig and Hive in Big Data implementations. Prepared ETL jobs using SAS DI Studio and worked on data warehouses. Developed reports in MS Excel. Experienced in writing SQL/PL-SQL in Oracle. Implemented models to reduce cost and increase profit. Team leading and project management experience.
Having worked with Bene Toys Nigeria Limited as an administrative and IT officer for well over 5 years, I believe I am in a very good position to contribute to the success of your business. The attainment of my employer's goals and objectives is our collective aspiration. I require minimal supervision in my job, hitting target deadlines is my priority, and my communication skills are very strong. By employing me, you can rest assured that your job is in the right hands and will be delivered ahead of time with utmost accuracy. Thank you.
Distributed computing expert in Machine Learning, NLP, social media data (Facebook/Twitter) processing and sentiment analysis using scikit-learn, Weka, HBase, Hive, Python, Lucene, Spark, Tachyon, BDAS and Hadoop. Architect of Big Data, real-time BI and data science solutions using a hybrid Lambda (MPP RDBMS + distributed NoSQL) architecture. Proven track record in scalability and sub-millisecond response times for high-frequency financial trading (OLTP) applications.
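As a minimal sketch of the kind of scikit-learn sentiment analysis mentioned in this profile: a bag-of-words vectorizer feeding a linear classifier. The tiny training set and labels below are illustrative placeholders, not data from any real project.

```python
# Sentiment-analysis sketch: bag-of-words features + logistic regression.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Illustrative toy corpus (a real project would use labeled social media data).
texts = [
    "I love this product, it works great",
    "Fantastic service, very happy",
    "Terrible experience, totally broken",
    "Awful support, very disappointed",
]
labels = ["pos", "pos", "neg", "neg"]

# Pipeline: tokenize and count words, then fit a linear classifier on the counts.
model = make_pipeline(CountVectorizer(), LogisticRegression())
model.fit(texts, labels)

print(model.predict(["great product, love it"])[0])  # → pos
```

The same pipeline scales out by swapping the in-memory fit for Spark MLlib or a Hadoop batch job once the corpus no longer fits on one machine.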
We are a fast growing well funded firm providing End-to-End Enterprise Cloud Computing, Virtualization and Systems Integration services with Strategic Consulting to leverage an organization's IT resources in a way never imagined before. Our approach ensures optimal Return on Investment and Cost Reduction Benefits with minimal overheads and operational costs, all the while maximizing the flexibility and scalability of the infrastructure. Our USP is our ability to analyze and utilize the existing setup and build on top of it thereby effectively integrating legacy physical infrastructure and services with those of the Cloud. Being an independent, unbiased and neutral authority on cloud, we strive to deliver impact and value through your journey of cloud readiness.
I hold an M.S. in Computer Science and have more than 17 years of experience in the IT industry managing products and bootstrapping, mentoring and managing high-performance teams. I possess a unique blend of technical prowess, product planning and development experience, deep domain expertise and an insatiable urge to learn new things, which has enabled me to build and develop complex solutions that meet the specific needs of my customers.
I have hands-on experience working with various Hadoop distributions (Cloudera, Hortonworks, Pivotal HD, IBM BigInsights).
We provide solutions to small businesses for application development, deployment and support. Contact us for Hadoop (Big Data) consultancy, training, and POC/project development, deployment and troubleshooting. Specialties: Big Data, Hadoop, MapReduce, Hive, HBase, Flume, Pig, Hue, Greenplum, Cloud Computing, Java.
I have experience with 2 POCs in Hadoop and its ecosystem. I have experience with Hive, HBase, Sqoop, ZooKeeper, HDFS and MapReduce concepts.
I am well versed in the latest Big Data technology, Apache Hadoop, and its ecosystem projects like Pig, Hive, HBase, ZooKeeper, etc. I can help set up Apache Hadoop in standalone, pseudo-distributed and fully distributed modes, and I can help you administer Apache Hadoop for efficient use of resources. I have working knowledge of NoSQL technologies like Cassandra, HBase, Couchbase and MongoDB. I design architectures and carry out capacity planning and resource utilization to deliver better performance.
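To illustrate the pseudo-distributed setup mentioned above, a minimal configuration might look like the fragment below (the hostname, port and replication value are the usual single-node defaults, shown here only as an example):

```xml
<!-- core-site.xml: point all daemons at one local NameNode -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- hdfs-site.xml: replication factor 1, since every daemon runs on a single host -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
```

In fully distributed mode the same two properties change: `fs.defaultFS` points at the dedicated NameNode host, and `dfs.replication` is raised (commonly to 3) so blocks survive node failures.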
I am a professional consultant with over 18 years of experience in the IT industry. I specialize in the design, architecture, implementation and operation of end-to-end distributed data ingestion, warehousing and analytics solutions.
+ Migrating current infrastructure to Hadoop/HBase or Cassandra + Providing ETL design and architecture on top of existing infrastructure using Hadoop + Establishing workflows on top of Big Data + Integrating the latest state-of-the-art Big Data visualization technologies like Talend and D3.js + Helping decide between real-time and near real-time online analytics + Crisis handling: disaster recovery, NameNode troubleshooting, data loss prevention and recovery on HDFS + Training in the area of Big Data technologies. Regards, CTIIT.in
If you need high-quality and accurate work, contact me! I'll be happy to be at your service. I have a good amount of knowledge and more than 2 years of professional experience in many areas, such as networking, system administration, security, troubleshooting and Hadoop cluster deployment. I look forward to finding all kinds of exciting jobs and meeting awesome new people. Thank you!
A one-stop solution to all your BI/DW needs! Our mantra is Information, Intelligence & Insight. An incubator of thought leaders, collaborators and enablers, we're a team of thinkers and builders who invent smarter, faster, more intuitive business applications for companies that want to compete on analytics. Our team has served in strategic and tactical roles in a variety of industries, and we can work in diverse sectors like Finance, Insurance, Education, Healthcare, Retail/E-commerce, Telecom, Banking, Pharmaceutical, etc. We leverage our years of experience and astute problem-solving skills, applying the right technology and methodology to ensure that we neither oversimplify problems nor over-engineer solutions.
SQL & NoSQL Data Architect | ITIL® & PRINCE2® Practitioner | *** Open Network for Database Professionals *** http://www.linkedin.com/in/devendrashirbad -- Devendra is known as "Dev" among the group. -- He is an individual contributor and a highly motivated, results-driven IT professional with ~12 years of experience. -- His technical skills are certified by top blue-chip IT companies like Microsoft, HP, IBM, Oracle and MongoDB. -- He follows AXELOS Global Best Practices for IT and project management and is a certified ITIL® & PRINCE2® Practitioner. -- He is ex-Microsoft CSS (Customer Service and Support) for SQL Server 3rd tier and specializes in SQL Server replication, performance troubleshooting and data recovery in critical situations (CritSit) on production servers. -- He has substantial experience in database architecting, database administration, and high availability and disaster recovery (HA & DR) solutions, and has good exposure to data warehousing.
I am a Java/Hadoop programmer, well versed in Hadoop, MapReduce, Pig and Hive technologies. I have also worked with Java, JSP, Ext JS, Hibernate and Spring frameworks, and I am good at J2EE technologies including EJB, JMS and JDBC. I have worked on ETL and data warehousing projects, including with the Talend tool. I am also good at object-oriented analysis and design using UML. My passion is coding and writing code that matters.
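The MapReduce skill named in this profile can be sketched with the canonical word-count example, written here as Hadoop Streaming-style map and reduce functions but run locally on a small in-memory sample (the sample lines are illustrative only):

```python
# Word-count in the MapReduce style: map emits (word, 1) pairs,
# reduce sums the counts per word after a sort-by-key (Hadoop's shuffle).
from itertools import groupby
from operator import itemgetter

def mapper(line):
    # Emit a (word, 1) pair for every word in the input line.
    for word in line.lower().split():
        yield (word, 1)

def reducer(pairs):
    # pairs must arrive sorted by key, as the shuffle phase guarantees.
    for word, group in groupby(pairs, key=itemgetter(0)):
        yield (word, sum(count for _, count in group))

lines = ["big data big wins", "data matters"]
mapped = sorted(kv for line in lines for kv in mapper(line))
counts = dict(reducer(mapped))
print(counts)  # → {'big': 2, 'data': 2, 'matters': 1, 'wins': 1}
```

On a real cluster the same `mapper`/`reducer` bodies would read stdin and write stdout under Hadoop Streaming, with HDFS and the shuffle replacing the local `sorted` call.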
Architected and implemented multiple data analytics systems using Hadoop.
Mobile App developer and Hadoop expert
We are a one-stop shop for your data apps and analytics platform. We help build, operate and manage cloud-agnostic platforms (Amazon, Windows Azure and Rackspace) for your data apps and analytics. We support Tableau, Microsoft BI, IBM Cognos, SQL Server, MySQL, Oracle, Hadoop and Ruby on Rails competencies, and we are a Microsoft, IBM, Tableau, Cloudera, Rackspace and Amazon partner. Providing all the DevOps capability necessary to operate an analytics infrastructure is no easy task, given complex technologies, an increasing number of data sources, and the variety and volume of data. LogicMatter eases that burden for you. Monitoring and managing this infrastructure requires a broad set of capabilities and, more importantly, dedicated capacity. http://www.logicmatter.com