I am a Big Data Architect with 10 years of experience. I have implemented multiple data pipelines and complex analytic solutions using the Hadoop ecosystem (MapReduce, YARN, Hive, Pig, Sqoop, Flume) and NoSQL solutions (HBase, Cassandra, Couchbase). Implemented real-time pipelines using Storm and Kafka, and DW solutions using Spark and Spark SQL. Extensive knowledge of AWS and Amazon Redshift. Cloudera Certified Developer for Apache Hadoop; Cloudera Certified Specialist in Apache HBase; Cloudera Certified Administrator for Apache Hadoop. Specialties: Big Data, Hadoop, Spark, Storm, Kafka, Couchbase, Cassandra, HDFS, MapReduce, Pig, Hive, HBase, Sqoop, Crunch, Mahout, Java, JSP, Servlet, Spring, Hibernate, SQL, Ant, Subversion, ExtJS, Android.
We are very enthusiastic about Big Data technologies like Hadoop, MapReduce, Hive, HBase, Sqoop, Oozie, Scala, and NoSQL databases like MongoDB. We are proficient in Java and can develop complex MapReduce applications per the client's requirements to handle large volumes of data. We are capable of setting up a cluster for parallel distributed computing in Hadoop and can install all the required software. In a word, we can handle everything from scratch to product: setting up the cluster, installing Hadoop ecosystem components, and developing MapReduce and Hive applications integrated with components like HBase, Sqoop, and Oozie.
I have over 3 years of experience as a Hadoop developer and am expert in the following areas: 1) installing and configuring Hadoop clusters; 2) importing data from SQL databases to HDFS using Sqoop; 3) writing Hive queries to load and process data in HDFS; 4) writing MapReduce programs in Pig; 5) knowledge of Hadoop and Hive analytical functions; 6) exporting data from HDFS to Impala.
I am an electronics engineering graduate. I started my career as a Java developer for 2 years and then moved into middleware support technologies like IBM WebSphere Application Server, Apache Tomcat, and IBM MQ. I then taught myself Hadoop, learning Hadoop, Pig, Hive, and Flume, and have now started working on a few Hadoop-related demo projects.
I have 1+ years of experience as a Hadoop developer and admin, with excellent programming and logical skills in big data analysis and managing unstructured data. Strong knowledge of Pig, Hive, MapReduce, HBase, Sqoop, Spark, Java, and Unix scripting.
I am a Cloudera certified data engineer - looking for interesting work on the side.
QBurst Technologies is a global software services organization with a strong focus on new-generation technology platforms. We have business and development centers operating out of the USA (Virginia), Europe (London and Poland), the Middle East (Dubai), the Far East (Singapore), and India (Trivandrum, Cochin, and Koratty). Our clients include some of the world's leading corporations. Our software portfolio includes web and mobile development, user experience services, design services, independent validation and testing, and competitive intelligence services. Our global team of 400+ personnel has experience bringing over 500 projects to successful completion. Our focus on new-generation technologies, our global partnerships with Microsoft, Oracle, Amazon, and Adobe, and our competencies in Agile methodologies and processes have helped us meet our commitments and achieve our goals.
I am a Hadoop-BigData Developer
I have 3.5+ years of Hadoop and data engineering experience. I have worked on a variety of technologies, spanning from Hive ODBC driver development to recommendation engine development for online streaming media services. I deliver very high-quality work and can quickly grasp any new technology. [I'm new on elance.com; I have a 5-star profile on oDesk at https://www.odesk.com/users/~019b5641196dbbc2c3]
I am well versed in the latest Big Data technology, Apache Hadoop, and its ecosystem (Pig, Hive, HBase, ZooKeeper, etc.). I can help set up Apache Hadoop in standalone, pseudo-distributed, and fully distributed modes, and can administer Apache Hadoop for efficient use of resources. I have working knowledge of NoSQL technologies like Cassandra, HBase, Couchbase, and MongoDB. I design architectures and do capacity planning and resource utilization to deliver better performance.
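For readers unfamiliar with the modes mentioned in profiles like the one above: in stock Hadoop, the jump from standalone to pseudo-distributed mode comes down to a couple of small config files. A minimal sketch (the property names are standard Hadoop ones; the host, port, and single-node replication value are the usual single-machine defaults, shown here only as an illustration):

```xml
<!-- core-site.xml: point clients at a single local HDFS namenode -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- hdfs-site.xml: only one datanode, so keep one replica per block -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
```

Fully distributed mode uses the same properties, with the namenode address pointing at a real host and replication raised (commonly to 3).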
I am skilled in the latest Big Data technology, Apache Hadoop, and its ecosystem (Pig, Hive, HBase, ZooKeeper, etc.). I can help set up Apache Hadoop in standalone, pseudo-distributed, and fully distributed modes, and can administer Apache Hadoop for efficient use of resources.
Someshwar Kale has been working on Big Data technologies for three and a half years. He is involved in the development of Big Data solutions and actively works on various proofs of concept (POCs). He has completed the "Cloudera Certified Developer for Hadoop" (CCD-410) certification. Responsibilities: worked on Hadoop subprojects like MapReduce, Hive, and Cassandra; requirements study; migration of existing applications to Hadoop. Technical skills: algorithms and paradigms (analysis of sorting algorithms, MapReduce design patterns and analysis of algorithms, Hive, Cassandra, data structures); programming (Core Java); scripting (Unix Bash); platforms (Linux, Windows); source control (SVN); tools (Sqoop); NoSQL query language (Hive); NoSQL databases (Cassandra); real-time computation system (Storm); workflow schedulers (Oozie, cron).
Hadoop and Big Data fan, working as a Big Data architect. I offer end-to-end Big Data integration skills. My LinkedIn profile is http://au.linkedin.com/in/jagatsingh. If you search my name (Jagat Singh + Hadoop) on Google, you can find more about the work I do (or have done) in the Hadoop world. I have experience in MapReduce, Pig, Hive, Sqoop, Oozie, Storm, Mahout, RHadoop, R, and Cascading. I hold the following certifications: Cloudera Certified Hadoop Developer; Cloudera Certified Hadoop Administrator; IBM Certified Cloud Solution Provider; Sun Certified Java Programmer; ITIL certification for managing IT services. I offer complete end-to-end solution design/architecture and deployment skills. I work on only one project at a time, so before confirming I would share my current workload and when I can start for you. I am currently in the Melbourne time zone. Happy to talk further with you. Thanks for reading.
I am a resident of Mumbai, currently employed with Bank of America Continuum Services Pvt Ltd. In the last 6 months, I have obtained training and certification in Big Data/Hadoop ecosystem technologies, including MapReduce, YARN, Apache Pig, Hive, Sqoop, ZooKeeper, Flume, HBase, Amazon Elastic MapReduce, and Amazon S3. The training certificate can be viewed/verified at http://edureka.in/verify (after clicking the link, enter Unique ID 13090702001). I am currently looking for a Hadoop-related project. I have 10+ years of experience as a software engineer in AS/400 technology; my present work involves developing and maintaining software on the iSeries platform for the mortgage servicing business. My domain experience includes mortgage servicing, banking, supply chain management, US customs and shipping, and healthcare.
I have good experience in MapReduce, Hive, Pig, Sqoop, Python, Java, and Linux.
I have 5+ years of experience in Oracle- and Hadoop-based projects, with very good experience in Oracle SQL, Informatica, Hadoop, Hive, Pig, Sqoop, TOAD, and Linux.
13+ years of experience in the IT industry. 2+ years of experience in Big Data analytics using Hadoop, MapReduce, HDFS, Hive, Sqoop, Pig, and HBase. 5+ years of experience in Microsoft Business Intelligence (MSBI), analytics, and ETL implementation using SQL Server Integration Services (SSIS), and BI implementation using SQL Server Analysis Services (SSAS), SQL Server Reporting Services (SSRS), SharePoint, and Office 2010 tools. 5+ years of experience in project management. Presently working as a Technical Architect/Sr. Project Manager, managing a team of 25 at Applied Materials on Microsoft technologies, MSBI, and the Big Data platform. Hands-on experience in SQL Server 2005/2008 RDBMS database development, and in the design and development of .NET and Microsoft SharePoint applications. Deep understanding of Big Data concepts, technologies, and solutions.
I have 5 years of experience in software design and development, with 2 years in Big Data, including Java, Hadoop, HBase, Hive, Pig, Oozie, and Sqoop. I initially worked with mainframe technologies, and it is my interest in open source that made me excel in Big Data. I have also done ETL and reporting with Pentaho. My responsibilities include designing and developing job workflows in Oozie, developing MapReduce modules in Java, Pig, and Hive, monitoring jobs, and performing any possible code optimization. With a wide variety of technology background and the ability to learn any new technology in a very short time, I would add great value to your organization.
6.5 years of experience in ETL, data warehousing, Informatica, Big Data, Hadoop, Hive, HBase, and Sqoop.
10+ years of experience in IT and Big Data technologies like Hadoop, MapReduce, Sqoop, and Hive; also experienced with various analytical tools like R, Pentaho, Datameer, Tableau, etc.
Big Data professional with 8+ years of experience in the IT industry (2.5 years on Big Data). Hands-on experience with various components of the HDFS ecosystem, like MapReduce, Hive, Pig, HBase, Sqoop, and Flume. Successfully completed 2 projects on Big Data Hadoop, serving as SPOC for the entire delivery from inception to post-implementation support. Expertise in delivering Big Data Hadoop trainings, with several successfully delivered in this area. Solid experience grooming fresh graduates on the project. Diverse professional experience: 3+ years in academics, 8+ years in the IT industry. Well versed in the SDLC, with an understanding of agile methodologies. Good experience in software estimation for Big Data Hadoop. Multi-technology experience has helped me integrate Hadoop with legacy systems.
I have more than 2 years of experience in Hadoop, MapReduce, Hive, and Sqoop.
8+ years of total IT experience in BPM and the Hadoop Big Data ecosystem. I hold the following certifications in Big Data and cloud computing: Cloudera Certified Hadoop Developer (CCDH410); Cloudera Certified Hadoop Admin (CCAH410); Rackspace cloud certification (CloudU); Sun Certified Java Programmer (SCJP). Experience in the Hadoop Big Data ecosystem: MapReduce1, YARN, HDFS, Pig, Hive, Sqoop, Oozie, Hue, ZooKeeper, etc. Experience with multiple Business Process Management (BPM) products, like Savvion Business Manager, Oracle BPM, Oracle SOA Suite (BPEL), and Business Rules Management Systems. Extensive experience analyzing, process modeling, designing, and implementing complex business processes using BPM technologies. End-to-end BPM, SOA, J2EE, and BRMS project delivery for many Fortune-list organizations, like D&B, GM, Motorola, and Reliance, in the USA, Singapore, and India. Sound experience in J2EE technologies like JSP and Servlets, and frameworks like Struts, Spring, and iBATIS.
I have experience in shell scripting, Java, Struts, Spring, web services, Hadoop, HDFS, Pig, Hive, Sqoop, Flume, ZooKeeper, and HBase.
Hello, this is Vidyasagar G. I have been working on Big Data platforms for 5 years now. Member of the Apache Software Foundation and a Hadoop and Hive committer. Worked on 9 projects with clients across the globe, like Walmart, The Home Depot, State Farm Insurance, FedEx, DHL, etc. I have working experience in Hadoop, Hive, Pig, HBase, Cassandra, MongoDB, Voldemort, Sqoop, Flume, RabbitMQ, Oozie, Azkaban, Node.js, HTML5, Java, J2EE, MySQL, and Oracle. Skype ID: Vidyasagar.gudapati. Best regards, Vidyasagar
9+ years of experience with a strong emphasis on the design, development, implementation, testing, and deployment of applications. More than 2.5 years of experience in the Hadoop ecosystem and Big Data analytics. Expertise and hands-on experience in core Hadoop and the Hadoop technology stack, including HDFS, MapReduce, Pig, Hive, Sqoop, HBase, Oozie, Storm, Flume, Kafka, Spark, and ZooKeeper.
Highly skilled Hadoop professionals: experts in HDFS pseudo-cluster development, MapReduce programming, HBase, Hive, Pig, Impala, ZooKeeper, R, Sqoop, and various other tools related to data science and analytics. Formed in 2014.
Technical skills. Operating systems: Windows, Linux, Unix, IBM iSeries i5/OS. Skills: QE automation, server-side testing, manual QE, project/product/process management. Programming languages: Core Java, Python. Scripting languages: Perl, Unix shell, SQL, Ant, NAnt. Automation tools: Selenium WebDriver framework, JUnit, TestNG. Project management tools: IBM Rational Team Concert, Rational ClearCase, IBM CLM, TortoiseSVN, MS Office. Bug tracking and test case management tools: IBM CMVC, Mantis, Bugzilla, Testopia, DevHound. Message queue tools: GlassFish Server, QBrowser. Databases: Oracle, MSSQL, MySQL, and PostgreSQL. Database tools: Pentaho, SSIS, SQL Developer, SQuirreL client. BI reporting tools: Eclipse BIRT, SAP Crystal Reports, IBM Cognos, Jaspersoft iReport. Big Data: Hadoop 1.x and 2.x, HBase, Hive, Pig, ZooKeeper, and Sqoop. Domains: monitoring, business intelligence and analytics, databases, Big Data and Hadoop, travel and expenses.
3.6 years of professional experience in the IT industry, with 2 years of exclusive experience as a Hadoop developer. Strong knowledge of development with the Hadoop architecture, HDFS, Hadoop clusters, and the MapReduce API. Experience in data flow languages like Pig Latin, developing Pig scripts and writing UDFs. Strong knowledge of writing MapReduce jobs using the Java MapReduce API. Experience with the Hive query language, HiveQL, and UDFs. Hands-on experience migrating RDBMS data to HDFS and Hive using Sqoop. Expertise in UNIX commands, Bourne-Again Shell (Bash) scripting, and Oozie.
I have experience with 2 POCs in Hadoop and its ecosystem, including Hive, HBase, Sqoop, ZooKeeper, HDFS, and MapReduce concepts.
I have been working on Hadoop, Java, MapReduce, Hive, Pig, Sqoop, and Oozie for the past 3 years.
I have strong experience in Hadoop cluster setup on AWS and in other components of the Hadoop ecosystem. Also very good with MapReduce, Hive, and Sqoop.
My name is Premsagar Gadapa. I started my IT career in July 2011 and have 2.5 years of experience in software development. I have worked on different technologies like Big Data Hadoop, cloud computing, and Ab Initio. I am mainly looking to work on Hadoop; I have good knowledge and hands-on experience with the Hadoop ecosystem, including Hadoop installation and configuration, and a good understanding of the Hadoop stack (MapReduce, Pig, Hive, Sqoop, and Flume). I have done quite a few POCs and projects, and have good knowledge of cloud computing (EC2 and S3).
A seasoned J2EE designer/developer and Big Data engineer with success in the design and implementation of products in Trade Finance, Supply Chain Finance, Trade Services Utility (TSU), Credit Risk, Market Risk, and Wholesale Risk. Currently working on predictive analytics for one of the largest U.S. banks. Extensive hands-on experience in Hadoop, MapReduce, Hive, Pig, Sqoop, Mahout, Spark, Crunch, Avro, Storm, and Kafka.
Big Data: Apache Hadoop (MapReduce programming, YARN, Hive, Impala, Phoenix). NoSQL: HBase, Cassandra. Apache Spark: Spark SQL, MLlib, Spark Streaming. Languages: Java, Python, R, Scala.
6 months of experience with Big Data: Hadoop, Hive, Pig, Sqoop, and Python.
maexadata (pronounced mek-za-data) is a technology company based in Mumbai, India. maexadata designs and develops applications oriented towards getting the maximum productivity and utility from Big Data. We understand how to get the best out of the Hadoop® architecture and the Hadoop® ecosystem, be it MapReduce programs in Java, a combination of Pig and Hive for your unstructured data, or Sqoop and Hive to integrate your SQL data into the Hadoop® environment. Moreover, being proficient with HBase and R programming, we understand and know what the industry requires in terms of data analytics. We take a lot of pride in offering a complete technology solution covering hardware, software, and connectivity. Our high-performance delivery is backed by knowledgeable consultants and a responsive support team. Our single-point agenda is, and always will be, customer delight on a continuous, ongoing basis.
I have good exposure to a range of technologies, from core and advanced Java up to Hadoop and Tableau reporting. My aim is to work on tougher tasks and deliver the best to clients, while getting a grip on leading-edge technologies.
14 years in software development. Expertise with the following technologies: .NET, C#, ASP.NET; Java/J2EE (servlets and JSP only); Big Data with Hadoop and related technologies; MapReduce, with development experience in common patterns including summarization, joins, sampling, data organization, and filtering; Pig; Hive; Oozie; Sqoop.
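For readers unfamiliar with the MapReduce patterns such profiles refer to, here is a minimal pure-Python sketch of the summarization pattern (word count), with the map, shuffle, and reduce phases simulated in-process. No Hadoop is required, and all function names here are illustrative, not part of any Hadoop API:

```python
from collections import defaultdict

def map_phase(record):
    # Emit (key, 1) pairs, one per word: the "summarization" pattern.
    for word in record.split():
        yield word.lower(), 1

def shuffle(pairs):
    # Group values by key, as the framework does between map and reduce.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    # Collapse each key's values into one summary value (here, a count).
    return key, sum(values)

def word_count(records):
    pairs = (pair for rec in records for pair in map_phase(rec))
    return dict(reduce_phase(k, v) for k, v in shuffle(pairs).items())

print(word_count(["big data", "big hadoop"]))  # {'big': 2, 'data': 1, 'hadoop': 1}
```

The other patterns the profile lists (joins, sampling, filtering) follow the same map/shuffle/reduce skeleton with different emit and aggregate logic.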
3+ years of experience as a Java/J2EE developer. MongoDB for Java Developers certified.
12+ years of IT experience. Played various roles (developer, technology lead, SOA architect, lead Big Data architect, etc.) on very large projects with Fortune 500 customers. 3+ years of experience in the Hadoop ecosystem. Deep technology understanding, architecture and design skills, and hands-on experience. Implemented data analytics on datasets ranging from 10 TB to 30 PB (yes, I mean petabytes). Installed Hadoop clusters on EC2, Red Hat, and CentOS, on 10 to 2,000 nodes. Innovative thinking, rich field experience, deep technology expertise, and great creativity.
Got data? Need answers? Structured or unstructured, I can pull the information you need: clickstream, log files, sentiment, sensor, geographic. Whatever data you have and need answers from, I can help.
To bring together knowledge of networking, programming, cloud, and open-source tools. Extensive knowledge of the various TCP/IP networking protocols. Worked extensively on Ubuntu, BackTrack, Fedora, and CentOS. Implemented the MapReduce framework in Java using Hadoop on multi-node clusters. Specialties: network security (network intrusion and detection; Metasploit, nmap, netcat, HackerDefender, rootkits). Languages: C, Java, SQL, HTML, XML. Scripting: Perl, Bash, Python. Other: Hadoop, MapReduce, Hive, Pig, HBase, Apache Maven, Oozie, Sqoop, Apache Ant, ETL processes, Apache HttpClient, JBoss, Tomcat, OpenTSDB. Tools: Eclipse, NetBeans, Karmasphere. Business intelligence: MicroStrategy.
Working on Big Data projects; I would like to take on more Big Data projects.
Hadoop developer with 3 to 4 years of experience in Big Data technologies, designing and implementing Hadoop at various firms, including Bank of America. A Big Data enthusiast.
Goal: make Big Data and the cloud the lifeblood of enterprises, and be at my best when coaching the next person who needs it.
I have 2 years of Hadoop experience. I love to take on challenges.
Hi, I have good experience with Hadoop and its ecosystem components.
15+ years of seasoned experience across the SDLC (software development life cycle). US citizen with active MBI (Minimum Background Investigation) clearance. 3 years of experience in Big Data technologies (Hadoop HDFS/MapReduce, Pig, Hive, Flume, Sqoop, Oozie). Good understanding of NoSQL databases: HBase, Cassandra, MongoDB. Proven expertise in architecting and implementing software solutions using Big Data, ESB, BPM, and SOA. Experienced in both waterfall and agile (Scrum) development methodologies. Domain experience in federal government, consumer, pharmaceutical, financial, insurance, medical devices, telecommunications, and high-tech industries. Excellent analytical and communication skills.
I'm passionate about Big Data technologies, especially Apache Hadoop. I first heard about Apache Hadoop during my master's courses, and from that time I have been fascinated by the Big Data world. My first big professional success was introducing Apache Hadoop into the company I work for, skobbler, in 2012. Since then I have been working full-time as a DevOps engineer on Big Data projects; my current work involves managing and maintaining our in-house Hadoop and HBase clusters. Out of this passion, I initiated a Big Data/data science community in my town, Cluj-Napoca, Romania, with the goals of meeting new passionate people, working together on cool projects, and helping IT companies adopt Big Data technologies. So far we have held many meetups and workshops with a lot of participants.
Highly experienced data scientist and system architect specialising in processing large data sets. I have great technical experience in the Hadoop ecosystem of products, plus Oracle RDBMS and Documentum ECM. Machine learning expertise in Natural Language Processing and Image Processing.
I am an IT professional who worked for more than 10 years at Oracle Corporation, with a total of 17 years of experience in technology consulting services and product development, project/program management, pre-sales, business development, and embedded applications, in the areas of middleware, Java/J2EE, Big Data, data warehousing, and Oracle Apps, for the North America, EMEA, and APAC markets. I have worked in the healthcare/life sciences, manufacturing, and BFSI domains.
Princeton IT Services, Inc. is a leading technology consulting firm that specializes in Oracle database technology solutions. Our services cover database consulting, information security, software development, system automation, and staff augmentation. We provide guidance throughout all stages of the database development process.
Cloudera-certified Big Data developer with 7+ years of IT experience and 3 years of working experience in Big Data, including major Big Data development projects.
Working on Hadoop Projects
I am a data warehousing/database developer with 8 years of experience and an enterprise perspective, offering planning, development, and implementation of architectural concepts, strategies, road maps, and optimizations for data warehouses and databases. I am also a Cloudera Certified Hadoop Developer and an Oracle Certified Database Associate.
7 years of experience in PL/SQL, Hadoop, Hive, Pig, and Big Data.
I am a professional .NET developer and have worked on several automation projects. I have expertise in Microsoft Surface and Kinect technologies for motion and voice detection, and I also design apps for the iOS platform with full sounds and voice-overs. I am a novice in the field of Big Data using Hadoop.
Cloudera Certified Hadoop Developer and Oracle Certified Java 6 professional. I can work 5 hours a day on a Java/J2EE project. I have experience setting up HDFS clusters and in MapReduce programming using Hive and Pig.
To succeed in an environment of growth and excellence, and earn a job that provides job satisfaction and self-development while helping me achieve personal as well as organizational goals.
I am a data engineer specializing in processing big data on the Hadoop stack. I have written production-level code and built complex ETL pipelines over the Hadoop stack that process up to 200 TB of data.
Unicorn Vietnam is a start-up developing a telematics app for the Vietnam insurance market. Users who download the app get scores and feedback about their driving practices to help them drive more safely, and we help drivers get better insurance packages from insurers based on their good driving. Unicorn Vietnam also has a good Big Data training program and resources: we have a 5-node Hadoop cluster using Cloudera as a lab at our office, and our main product, the telematics app, uses Big Data profoundly. We are participating in the Kaggle AXA telematics challenge and are among the top 100: https://www.kaggle.com/c/axa-driver-telematics-analysis
Software professional who has worked on enterprise, web, Hadoop, Java, J2EE, and mobile applications, now focusing on Hadoop and Big Data. 5+ years of IT experience. Freelance consulting as a Big Data and Hadoop developer/admin.
With more than 15 years of experience, I have gained an in-depth understanding of Microsoft technologies and delivered solutions for different domains. Proficient in accomplishing projects beyond their quality targets and standards. As a Microsoft forum contributor and a certified professional, I have a proven history of healthy client relations and on-time deliveries.
3+ years of experience in the IT industry, with 1+ year of live exposure to Big Data, working extensively on Hadoop, Pig Latin, and MapReduce. Worked extensively as an ETL developer, handling development and production support of data warehouse applications for Apple's iCloud. Proficient in data analytics, analyzing the weblogs generated by various web applications. Expertise in the various analytical functions provided by Oracle. Develop, test, and maintain ETL procedures employing custom PL/SQL; performance tuning of SQL queries. Expertise in writing OS-level tools that interact with Oracle using UNIX Korn shell scripting. Experience in data extraction, data migration, and data loading. Developed a reporting automation framework using shell scripts, MapReduce (Java), and Pig Latin to process terabytes of data in minutes. Experience in substituting the various analytical functions of Oracle with MapReduce/Pig Latin.
More than 10 years of IT industry experience managing large IT applications and providing technology solutions to large enterprises. Competent in various technologies ranging from legacy mainframe (COBOL, JCL, VSAM, DB2, etc.) to the latest Big Data technologies (Hadoop, HDFS, Pig, HBase, etc.). Expert in delivering legacy modernization solutions that downsize a customer's mainframe in a short span of time via solution transformation and application and data migration to a flexible, agile open-systems platform, resulting in savings of up to 30 million dollars. Seasoned professional in solution pre-sales, including technical presentations, application assessment, estimation, demos, and proofs of concept for legacy modernization, database, and middleware products. Experienced in understanding and analyzing customer requirements and preparing accurate estimates for project implementation, including costs for hardware, software licensing, and professional services.
I have more than 9 years of experience in Java and J2EE technologies. I have been working with the latest frameworks, like Struts 2, Spring, Hibernate, EJB, and JPA, and also have strong experience in Java architecture for building more scalable, large applications.
More than seventeen years of experience in the software services industry, currently heading offshore delivery for an ODC in the retail domain. Certified Project Management Professional from PMI.org from 2006 to date; Certified Scrum Master from ScrumAlliance.org, valid until August 2012. Ability and experience in setting up large delivery units, setting financial targets, and monitoring and controlling delivery against set goals. Depth and breadth of technology, with experience in and understanding of the Microsoft technology stack: .NET, SharePoint, SQL Server. Hands-on experience in Big Data technologies like Hadoop, MapReduce, Pig, and Hive; experienced in the NoSQL DBs HBase and MongoDB. Specialties: P&L responsibility, operations excellence and optimization, delivery management.
Specialties: Big Data, business analytics, Java, J2EE, cloud/distributed computing, data exchange. Professional synopsis: a professional with over 6 years of experience in business application development, business analytics, and enterprise application integration. Excellent exposure to data interchange and Big Data integration/adoption; building Big Data-enabled teams from scratch; managing POCs, designing solution architectures, and technical pre-sales. Command of an array of tools and techniques across all the tiers (EIS, data consolidation, business, presentation, etc.) of modern applications. Goal: designing and implementing solutions for radical challenges and disruptive innovations in the fields of Big Data, business analytics, distributed computing, and service-oriented architecture.
I am working as a research analyst and consider myself a Big Data expert. I have been working on Hadoop for 3+ years and have worked with CDH3, CDH4, hadoop-2.2.0, and the Hortonworks Sandbox. I have expertise in both Hadoop administration and Hadoop development, and have also worked on other Big Data technologies like Apache Storm and Apache Spark.
I have rich hands-on experience with several Big Data technologies, having worked in a research group.
I am a DW & BI professional with 8 years of experience, having worked for MindTree Consulting, Getronics, and Bank of New York Mellon. I have a good understanding of the database, data integration, and BI space. Good analysis of requirements and planning and design of projects, backed by good documentation, is my current focus.
We are not just another web development or Excel/Tableau-based data analytics company. We solve real problems, problems where most tools simply fail. At BigStem we understand data and its 4 V dimensions: volume, velocity, variety, and veracity. Volume: collecting and storing gigabytes and terabytes of data. Velocity: a real-time explosion of time-series data from hundreds of sources. Variety: data comes in several forms and formats, structured or unstructured, encrypted or compressed. Veracity: accuracy (to some percentile) in the face of huge data. We are experts in the Hadoop family and in analytics mindsets. We are experts in descriptive analytics (we provide flexible visuals and content aggregations, using statistical tools and models, for domain experts to make decisions), and we create simulation analytics (query models that can tune simulations and generate results, like logistic regression, k-means, or nearest-neighbor models abstracted behind a query wrapper).
12+ years of experience in business/data analytics. Expertise in algorithms like cluster analysis (k-means, hierarchical clustering, etc.), Bayesian algorithms, predictive analysis, simple and multivariate linear regression techniques, and neural networks, using SAS, R, and Mahout. Strong in data visualization, building data science proofs of concept, communicating with multiple stakeholders, handling projects in parallel, and explaining complex mathematical models in a simple, visual way.
Looking forward to working with clients.
Digitronics provides cutting-edge solutions for any web-based application, including the best support throughout the journey from requirements gathering to deployment. We provide not only solutions but also the best customer experience, by ensuring: 1. a well-documented requirement specification (visual presentation); 2. overall end-to-end implementation (including system architecture); 3. testing standards equivalent to industry metrics; 4. a user manual; 5. easy maintenance, with an exciting discount rate.
We are RaaniSoft Consultancy Private Limited, a team of 5 experts dealing with Big Data analytics.
I currently work as a web developer on an e-commerce platform and have completed a Big Data Hadoop certification.
I am an individual with 10+ years of experience on the Java, J2EE, Hadoop, and MapReduce platforms, currently working with an investment bank. Looking for application development work on any Java-based technology or framework.
More than 15 years of experience in distributed computing, cloud computing, and Big Data. Expert in capacity planning, batch processing, and streaming.
Technowells LLC is a software programming, SQL Server 2005/2008, SSIS, SSRS, SSAS, ASP.NET, web design and development, and IT consultancy company. Technical skills: SQL Server 2005/2008, Visual Studio 2005/2008, .NET 2.0/3.5, EntitySpaces.net, SSRS, T-SQL, SSIS, DTS, PHP, MySQL, JavaScript, themes and skins, AJAX, Photoshop, Flash, Silverlight, LINQ, MS Access, Oracle, TFS, Subversion, Big Data, Hadoop, Hive, HBase, Pig, Flume, MapReduce, Sqoop, Oozie, Hue, Cloudera, NoSQL. Areas of expertise: client-server application development; desktop/Windows and web design and programming; C# and VB.NET programming; PHP/MySQL web design and programming; SQL Server database T-SQL programming; SQL Server database administration; SQL Server OLTP/DW performance tuning; SQL Server SSIS/DTS and ETL; SQL Server data migration, cleaning, and scrubbing; SQL Server data architecture/modeling.
I am an experienced software engineer with a master's degree in computer science. I have expertise in Java and big data technologies such as Hadoop and MapReduce. I am interested in part-time/side projects and only apply for posts that interest me. I have worked at big companies such as Amazon, building data processing and analytics platforms. The skills I have listed are those I am hands-on with or have significant experience in.
I am trained in Big Data technologies such as Hadoop, Hive, and Pig, and I have completed multiple proofs of concept with them. I am also excited to work on any promising technology.
We work very professionally and deliver client projects on time, providing our best services at minimal cost. We have served global clients from the US, Japan, the UK, Germany, India, and elsewhere. Please see the portfolio on our site for reference work.
About 15 years of experience in IT application development on various databases, progressing from developer to solution architect. Experienced in big data and advanced analytics, including descriptive, predictive, and prescriptive analysis. On-time, high-quality deliverables.
Banking domain knowledge, banking data structures, and the Hadoop ecosystem.
Strong work experience in the design and development of Big Data applications using Hadoop and its components, including HBase, Hive, Pig, and Oozie. Strong knowledge of performance optimization of Big Data applications for efficient and optimal resource utilization. Experience in capacity planning and in migrating critical legacy applications/products to Hadoop.
Hi, this is Ramesh. I hold a master's degree in Computer Applications. I started my career in SQL Server 2000 database administration and development, and I am a certified Oracle Database Administrator Professional as well as a certified developer, with hands-on experience in Siebel CRM 7.7. I currently work in Hadoop administration and development and have built my own Big Data centre: a multi-node, cross-platform Hadoop cluster on commodity hardware (Pentium 4 machines). I am working on several Big Data analytics use cases.
Working on a Big Data health care analytics project. Completed two POCs with Hadoop, Hive, and MongoDB. Certified Hadoop administrator and certified MongoDB developer.
9+ years of experience in Pentaho, Talend, and Informatica ETL; JasperServer reports, dashboards, ad hoc and analysis views, and domain administration; Pentaho PDI (Kettle ETL) development and administration; and QlikView and Crystal Reports. Sound knowledge of reporting, JasperServer administration, ad hoc reporting, Pentaho administration, and ETL & DW development. Good knowledge of SQL queries, ETL, and data-warehouse concepts. Efficient in Word and Excel for preparing user-specific requirement analyses. Good team management skills, with experience leading teams at client sites and offshore; a strong team player. Experience scaling open-source DWH using cluster and NoSQL concepts (Hadoop, MongoDB). Specialties: Informatica ETL, BusinessObjects InfoView, JasperServer reporting and administration, Pentaho PDI (Kettle ETL) and administration, QlikView, and Crystal Reports.
Highly organized, dedicated, self-motivated, and accomplished IT professional with two decades of architecture and development experience and deep knowledge of all aspects of software development. Expert-level skills in various technologies.
Working for small companies allowed me to wear many different hats in my career; luckily, my focus was mostly on data. A few hats I am proud of: BI/Hadoop administrator, SAP/Oracle BI ETL developer, BI report designer/developer, and SAP FICO payroll configuration analyst.
I have intermediate-to-expert knowledge of, and working experience with, Big Data, Hadoop and its ecosystem, Java, Spring XD, and integration with RESTful APIs. I am looking for a suitable role in which to explore the big data and advanced analytics space further.
Xenolytics is a Big Data company providing products, solutions, and services that generate actionable business intelligence for better decision-making by analyzing massive amounts of internal, external, structured, and unstructured data. Our products and solutions are built on industry-standard open-source Big Data infrastructure, and our solutions are versatile and customizable to the differing needs of enterprises. The team has decades of experience in business intelligence, telecom, supply chain, pharmaceuticals, RDBMS, NoSQL, and the Big Data ecosystem.
IntelliGrape Software is a premium technology company with proficiency in Big Data technologies including the Hadoop ecosystem, R, RMR, NoSQL, machine learning, and visualization. Highlights of our Big Data practice: Cloudera-certified Hadoop developers and administrators; DataStax Silver Partner; MongoDB Advanced Partner; proficiency in the Hadoop ecosystem, machine learning, visualization, and ETL tools; experts in NoSQL data stores; AWS-certified architects; MongoDB-certified developers.
Strategist and innovative software professional with a strong record of delivering disruptive products. More than 3 years of experience in Big Data technologies and the design and development of Hadoop-ecosystem-based enterprise applications. Cloudera Certified Developer for Apache Hadoop and 10gen Certified Developer for MongoDB. Technical highlights: Big Data technologies (Hadoop, Pig, Hive, Mahout, Flume, Sqoop, Oozie, MongoDB); web technologies (HTTP, REST, caching); Java, C++, data structures, MySQL, Memcache; software architecture, machine learning, and frameworks. Specialties: data mining and analytics, content and search platforms, cloud computing and storage, machine learning, consumer websites, and security. Currently building Big Data frameworks, solutions, and algorithms with Hadoop and related technologies, including: data pipelines, EDW, and ETL; Hadoop and Big Data infrastructure; data analytics; and recommendation frameworks.
15+ years of experience. MS in Computer Science, Johns Hopkins University. MBA in Marketing and Big Data, University of Missouri-Kansas City.