One of my clients motivated me to follow these golden guidelines. 1. Keep in mind that most of your clients want to feel appreciated. 2. Always treat your clients the way you want to be treated. 3. The most successful entrepreneurs are those who know how to treat clients. 4. Every client is a potential referral - always remember that. 5. Continue to work on your business savvy and NEVER let money be your motivation (driver); that creates a "lack mentality," and you will always have to scrap for money. 6. Prosperity comes from giving your BEST every single time (even when you don't want to).
I am a researcher and software developer interested in machine learning, data mining, and robotics. I have commercial experience with Big Data processing via Hadoop, HBase, and Pig, and with machine learning in MATLAB, R, and Python.
Vertizone provides BI and Big Data solutions. We have successfully designed and delivered working BI and data warehouse solutions using Pentaho, Talend, MySQL, AWS, RDS, Redshift, and Hadoop. We are consulting partners with Amazon AWS, Pentaho, Cloudera, and SpagoBI.
I am an experienced Hadoop developer with good knowledge of MapReduce and Hive. I worked at a leading IT firm for 2 years as a Hadoop developer and am interested in taking on Hadoop-related projects.
3.7 years of experience with a variety of technologies, including PHP (Magento, Joomla, CodeIgniter), Ruby on Rails, Backbone.js, CoffeeScript, Hadoop, Pig, MapReduce programming, jQuery, Bootstrap, CSS, etc.
Hi there, I am Manoj Sahu from India. I have almost 3.5 years of experience in the IT sector and hold an MBA in IT; IT is my passion. I am also Red Hat and MCITP certified. I am now moving into Hadoop/Big Data and am looking for part-time work where I can grow my knowledge.
I have 8+ years of experience in software and web development. I have good experience in design and development, having worked on a variety of projects - social media monitoring, search technology, financial systems, and website development - using Java/J2EE, Hadoop, HBase, MapReduce, MongoDB, Lucene, Spring, Hibernate, Struts, SOAP and RESTful web services, and MySQL.
Hi, please find attached my details and resume. I work at Capgemini Consulting India Pvt Ltd, Bangalore. I have 7+ years of overall IT experience, including 2+ years in the Hadoop ecosystem and the remainder in Java technologies. I would like to apply for the position with reference to your mail. Regards, Ravi Sankar Mobile:+91 --
I am a business intelligence and data warehousing expert and consultant. I have 8 years of experience in SAP BI/BOBJ, 2 years in SAP HANA, and 2 years in MicroStrategy, Tableau, QlikView, and Hadoop. I also have extensive experience in project management using Primavera P6 Planner/Scheduler.
Accure is a solution provider for Hadoop and related big data technologies. We provide big data advisory services, help customers understand data processing challenges, and advise on technology selection. Our core team comprises thought leaders and experts in Hadoop, NoSQL databases, and related big data technologies. We work with our customers to build solutions from proof of concept to production.
Xbeon provides cutting-edge business solutions in the field of big data. Xbeon understands how to integrate and support complex Hadoop environments by providing custom solutions that can be tailored to your budget. This means we learn your business requirements quickly and clearly. We then work closely with you, through a unique blend of client managers, engineers, and staff, to determine how best to add value to your enterprise and create a customized solution.
Java programmer with 7+ years of experience in web, mobile, and Hadoop application development and integration.
Senior technologist with strong business acumen and technical experience in the Big Data space. Results-oriented, decisive leader who combines an entrepreneurial spirit with corporate-refined execution in Big Data strategy, consulting, implementations, CoE setup, architecture, pre-sales, and revenue optimization. Adept builder of strategic partnerships across technical and global businesses for Fortune 500 customers. Areas of expertise include: ~ Hadoop administration and development ~ NoSQL and real-time analytics ~ Big Data/Hadoop solution architecture, design & development ~ Social media, community & viral marketing ~ Business development at global scale ~ Big Data strategy ~ Hadoop training Specialties: business strategy consulting, planning & operations in the Big Data space, Big Data architecture, Big Data/Hadoop implementations, Hadoop administration, NoSQL/HBase/Cassandra, EDW
13 years of experience in Java/J2EE. Credited with executing enterprise projects covering various aspects of software development, processes, and methodology. Proficient in emerging technologies such as the Hadoop ecosystem and NoSQL.
Over eight years of experience building Java/J2EE enterprise applications. Expertise in configuring the Cloudera Hadoop cloud platform and analyzing large data sets using MapReduce, HDFS, HBase, Pig, Hive, Sqoop, Flume, the Oozie workflow scheduler, the Lily content management system, and the NoSQL database CouchDB. Experience using Twitter Storm, Nokia Dempsy, and Esper for real-time data analysis. Expertise in developing distributed applications on the Amazon cloud platform and automating deployment using CloudFormation and Chef in a Linux environment. Experience developing geographic information system (GIS) web and desktop applications using Oracle Spatial, ESRI ArcGIS, and OGC-based open-source platforms. Over five years of experience in Agile-driven development.
To bring together knowledge of networking, programming, cloud, and open-source tools. Extensive knowledge of the various networking protocols of TCP/IP. Worked extensively on Ubuntu, BackTrack, Fedora, and CentOS. Implemented the MapReduce framework in Java using Hadoop on multi-node clusters. Specialties: Network security: network intrusion and detection - Metasploit, nmap, netcat, HackerDefender, rootkits. Languages: C, Java, SQL, HTML, XML. Scripting: Perl, Bash, Python. Other: Hadoop, MapReduce, Hive, Pig, HBase, Apache Maven, Oozie, Sqoop, Apache Ant, ETL processes, Apache HttpClient, JBoss, Tomcat, OpenTSDB. Tools: Eclipse, NetBeans, Karmasphere. Business intelligence: MicroStrategy
Specialized in Hadoop and HBase cluster setup and configuration, and HDFS and HBase programming. We have built a highly scalable, high-performance, Amazon S3-like cloud storage system (API-compatible) that has been deployed and used by several enterprises. Please contact us for more details.
I have extensive experience in Big Data analytics, including Hadoop, Hive, Impala, Flume, HBase, Sqoop, and Solr, and exhaustive experience with machine learning algorithms, mainly around text mining and predictive analytics (classification, ranking, etc.). My total industry experience is 6+ years, including 4 years in the domains mentioned above. I have also worked on data warehousing and BI tools such as Informatica and OBIEE.
Multidisciplinary skill set and experience in analytics and data mining, including big data, with profound business knowledge in the banking/finance, e-commerce, and supply chain management industries. Worked closely with marketing, retail, and risk management business teams to design, develop, and implement analytical solutions to their business problems, helping them deliver business impact to their organizations. Hands-on experience building mathematical, statistical, and quantitative models for day-to-day business problems. Successfully executed small (3+ month) to large (1+ year) projects for top clients such as eBay and GE Capital, and was responsible for delivering end-to-end engagements with potential savings of $100K to $2.0MM.
Big Data architect and developer seeking focused projects for full-time and part-time work.
With more than 14 years of experience across a wide array of software systems, working with global customers, we believe experience and vision are our key drivers. We have extensive experience in Oracle SQL/PL-SQL, shell scripting, system analysis and design (Waterfall/Agile), managing support functions, metadata-driven software development, and ETL software development using Informatica PowerCenter, Pentaho, Talend, etc.
Our expertise includes Big Data, databases, search, machine learning, and NLP applications. We have experience in the following areas: * Hadoop - Cloudera (CDH), Hortonworks (HDP), Apache Hadoop, MapR * Hadoop ecosystem - Pig, Hive, Flume, ZooKeeper * NoSQL - HBase, MongoDB, Cassandra * RDBMS - MySQL, PostgreSQL * Monitoring - Ganglia, Nagios * Cluster administration - Cloudera Manager, Puppet * Cloud computing - Amazon EC2, Amazon S3, DynamoDB You can check out our entire portfolio @ www.sigmoidanalytics.com We have experience in the following technologies: - Hadoop, Pig, Hive, Mahout, Sqoop, Flume - MySQL, MongoDB, Cassandra, HBase, CouchDB - Solr/Lucene, ElasticSearch, Nutch, Scrapy - R, GATE, Apache Stanbol, Tika, Jena, Jackson, Virtuoso, Weka, RDF, SPARQL, JSON, XML, DBpedia, Ektorp, RESTful interfaces - Django, Drupal, WordPress, Magento - Amazon AWS, Windows Azure, Google App Engine - APIs: Facebook, Twitter, LinkedIn - iOS, PhoneGap, Android
Over 15 years of experience providing architecture and software solutions using Microsoft technologies (C#, .NET, ASP.NET, MS SQL, etc.), Android, iPhone, and Java.
Over the last 5 years, I developed India's second-largest data center for a Norwegian telecom company and developed database architecture for applications built using Java and Savvion. I have extensive experience in database migration, defining backup policies, database performance tuning, and assisting clients with hosting servers and databases for new applications. I have extensive experience administering Oracle (9i, 10g & 11g), SQL Server 2005/2008, and MySQL 5.5, and configuring Amazon EC2 instances, Amazon RDS, and Amazon S3 along with Amazon CloudWatch. Successfully completed 3 Amazon AWS implementation projects and database migrations from company-hosted servers to Amazon RDS. Experience in cross-DBMS migration (e.g., migrating Oracle to SQL Server).
I am a seasoned IT professional looking for some side work that is interesting and in Big Data.
I have been working as a professional software developer for the last 5 years, including full-time positions at IBM and Adobe Systems. I am an Oracle Certified Java Programmer.
QBurst provides consulting and customisation services in Big Data under a broad reporting and analytics portfolio. As an early adopter of Big Data technology, we have the right skill sets to help organisations chalk out their Big Data strategy.
CERTIFICATION & TRAINING Red Hat Certified Engineer (RHCE) on Red Hat Enterprise edition 5.0 with a 96% score (Certificate Number --28234). Red Hat Certified Technician (RHCT) on Red Hat Enterprise edition 5.0 with a 100% score (Certificate Number --14649). Training on IBM DB2 at India International Institute of Management, Jaipur, organized by IBM. IBM Netezza certified developer. SUMMARY Possess expertise in object-oriented analysis/design in PHP, Java, and Flex 3.0 (ActionScript). Skilled at progressing from problem statement to well-documented design. Quick learner, dedicated to reaching the goal.
- 4+ years of total experience in data warehousing: Big Data, ETL, NoSQL, Unix, and shell scripting in the telecom/IT industry - 2.5+ years of experience in Hadoop administration and development - 1.5+ years of experience administering NoSQL technologies such as Cassandra, Couchbase, and MongoDB - Worked on ETL projects: data integration, data warehousing, and centralizing into HDFS (Hadoop Distributed File System) - Enterprising leader with excellent analytical, organisational, and interpersonal skills - Ability to understand and capture technical as well as business requirements - Experienced in business analysis (requirement gathering), preparation of processes, and specification of validations for converting business requirements into functional specifications - Recognized for good understanding of business needs, excellent communication, and strong client support - Ability to effectively perform multiple tasks across multiple projects and meet all target dates
I am a Cloudera Certified Administrator and Developer for Apache Hadoop (CCAH & CCDH), working with Big Data, and am seeking companies interested in efficiently and safely managing their rapidly-growing data. I perform my work for a company with a consultant attitude, whether I am a contractor or an employee. I consider the people I work with as my customers. This approach ensures I will always provide high-quality service. In addition, as I fulfill the needs of my customers, I strive to identify and fulfill needs previously unknown to them. This enables me to deliver a product or service that exceeds their expectations. I am very experienced in working with projects remotely. I enjoy automating complex and repetitive processes, to eliminate human error, and enable personnel to focus on higher-level responsibilities. Allow me to assist you in the success of your next project.
As the principal of Brevity.IE, I offer enterprise-grade web and data application development services using a multitude of open-source technologies. Our entire focus is on data analysis using R and Hadoop, with the Web as the frontend. We use rapid development frameworks such as Hibernate and Yii (for JSP and PHP, respectively). We can help deploy your application both on-premises and on AWS. With more than 23 years of software development and data science experience come quick, reliable, and relevant solutions. More importantly, we are able to understand the domain-specific nuances of your requirements and do full justice to them.
Indian Institute of Technology graduate. Management consultant and analytics professional. Expertise in Excel, Photoshop, Illustrator, project management, and data analytics.
- Around 4.1 years of IT experience - Around 2 years of experience in Hadoop framework development - Over 2.2 years of experience in data warehousing applications (OBIEE (Siebel Analytics)/Informatica), MicroStrategy.
I am an expert software developer with experience in data mining and data science. My Kaggle Rank is 600 (top 0.4% worldwide) and rising. I have degrees in medicine and computer science, with masters degree coursework in statistics and data mining. I have built my own Hadoop cluster (http://www.jyrocluster.com) as a cloud service for other programmers.
I'm looking for part-time software development opportunities in (though not restricted to) distributed computing, Big Data analytics, machine learning, and distributed systems technologies. Proficient in: Java, J2EE, EJB, JMX extensions, Servlets, JMS, JNDI, JDBC, Hibernate, Spring MVC, Hadoop, Lucene, HBase, Mahout, Pig Latin
I hold a Master's in Engineering & Technology with a specialization in Wireless and Network Technology, and I have hands-on experience in Hadoop administration and development (MapReduce), Java, HTML, JSP, cloud computing, and various other technologies as listed in my skill set.
We provide solutions to small businesses for application development, deployment, and support. Contact us for Hadoop (Big Data) consultancy, training, and POC/project development/deployment/troubleshooting. Specialties: Big Data, Hadoop, MapReduce, Hive, HBase, Flume, Pig, Hue, Greenplum, Cloud Computing, Java
+ Migrating current infrastructure to Hadoop/HBase or Cassandra + Providing ETL design and architecture on top of existing infrastructure using Hadoop + Establishing workflows on top of Big Data + Integrating the latest state-of-the-art Big Data visualization technologies such as Talend and d3 + Helping decide between real-time and near-real-time approaches to online analytics + Crisis handling - disaster recovery, NameNode troubleshooting, data loss prevention and recovery on HDFS + Training in Big Data technologies Regards, CTIIT.in
We are a fast growing well funded firm providing End-to-End Enterprise Cloud Computing, Virtualization and Systems Integration services with Strategic Consulting to leverage an organization's IT resources in a way never imagined before. Our approach ensures optimal Return on Investment and Cost Reduction Benefits with minimal overheads and operational costs, all the while maximizing the flexibility and scalability of the infrastructure. Our USP is our ability to analyze and utilize the existing setup and build on top of it thereby effectively integrating legacy physical infrastructure and services with those of the Cloud. Being an independent, unbiased and neutral authority on cloud, we strive to deliver impact and value through your journey of cloud readiness.
Distributed computing expert in machine learning, NLP, social media data (Facebook/Twitter) processing, and sentiment analysis using scikit-learn, Weka, HBase, Hive, Python, Lucene, Spark, Tachyon, BDAS, and Hadoop. Architect of Big Data, real-time BI, and data science solutions using a hybrid Lambda (MPP RDBMS + distributed NoSQL) architecture. Proven track record in scalability and sub-millisecond response times for high-frequency financial trading (OLTP) applications.
Consulted with enterprises on end-to-end enterprise architecture, with solutioning and mentoring experience in J2EE. Worked in the Health Care, Banking, Insurance, Automotive, and Retail vertical domains. Also well versed in the TOGAF 9.1 methodology. Specialties: enterprise architecture/solutioning/consulting/training with Java/J2EE, Scala, Play Framework, Akka/Spray, the Hadoop ecosystem, enterprise analytics implementation, SOA/REST web services, BPM with process servers, and cloud computing.
I have been involved in usage analytics and predictive analytics research for smart health portals at http://bckonline.monash.edu.au. My work explores data-driven solutions to improve content management, system design, and decision making for portals. As a developer, I delivered both front-end and back-end (dashboard) components, and I also performed many data analytics and data warehouse analyses. My previous roles include Software Engineer at Electronic Arts and Developer/Consultant for a logistics project at SMU/Port of Singapore Authority.
Two successful SharePoint 2013 enterprise projects in production (-Feb 2013). Multiple FAST Search 2010 projects successfully delivered, multiple FAST ESP projects successfully delivered. Senior .NET architect/developer. Own infrastructure (Remcam LLC). Mahout/Hadoop for keyword/topic extraction from large corpora. View my profile on LinkedIn: http://linkedin.com/in/vanschalkwyk .
5 years of experience in Java web development at large companies, currently doing big data processing at Baidu; familiar with Java, Hadoop, Hive, Struts2, Hibernate, Spring, Python, and Ruby.
Solutions for Hadoop, Java, J2EE, Hadoop HDFS, MapReduce, and Hadoop clustering.
I have been working as a software engineer at an MNC and have 4 years of industry experience in software design and development. I have used Java for most of my development work but am also familiar with Python. I currently work in the analytics domain, developing Hadoop applications. I am also interested in machine learning and have completed a couple of projects in that domain, including a recommendation system for Stack Overflow and a tool for analyzing tweets.
We are a group of people passionate about the evolution of Big Data storage and analytics. Our areas of strength are: - Hadoop setup - MapReduce programming - Storage and analytics - HBase Instead of focusing on money, our team's goal is knowledge acquisition.
I am enthusiastic, friendly, kind, and honest. Working on Big Data is my passion.
Trained over 50 students from the US and India over 1 year in Big Data and Hadoop.
Over 10 years of information technology experience. Hands-on experience working with relational databases and Hadoop, and writing and testing programs in Python, shell, Perl, and PHP. Also worked as a system and network administrator, with a broad view of modern IT technologies. Strong analytical and problem-solving skills. LinkedIn profile: http://www.linkedin.com/pub/jordan-velev/17/9b2/825
Contributor, author, and user of open-source products in and around Hadoop.
I am a Hadoop developer with 4+ years of experience.
5 years of industry experience in Linux/Hadoop administration. Proficient in shell/Bash scripting.
I have worked on Big Data and Java on several projects and have good knowledge of Hadoop, Hive, HBase, Cassandra, Neo4j, and Java.
Distributed computing programming using Hadoop MapReduce in Java. Experience with data mining, predictive modeling, and statistical computing tools such as IBM SPSS, SPM CART, Weka, XLMiner, and R. Familiarity with data preparation and pre-processing methods in the field of data mining. Good interpersonal skills, coupled with proven experience communicating with business and technology stakeholders. Knowledge of the Rational Unified Process framework for iterative software development. Working technical experience with relational databases using Oracle, SQL, DB2, and Datacom. Expert in object-oriented programming language concepts and terminology.
Mobile App developer and Hadoop expert
I have 15+ years of experience with data. More recently I was at Yahoo for 6 years working on Hadoop/Pig with big data. I can help you with all your data needs: - Analytics - Validation and testing - Migration - Modeling - Visualization
Master of Science in Computer Science with 6 years of experience in Java/J2EE using the Spring framework, as well as Hadoop.
I have been working as a Hadoop developer since 2011. I have been part of many POCs and am currently working on an IBM BigInsights project for a US-based insurance company.
Having worked on several cloudification projects, I have strong expertise in AWS, Hadoop, MapReduce, MongoDB, C, scripting, and algorithms. I worked in the Wi-Fi industry for almost 7 years, helping them move from a single instance to a 'real cloud' where scalability was built in from the design phase to make it scale like Google. I'd be very eager to work with a variety of technologies, and I hope my work will be appreciated.
MySQL administration. Hadoop administration. Hive. Pentaho for reporting.
I have 2.2 years of Java experience, with 1.5 years of relevant Hadoop MapReduce experience. I have worked on a project that uses an international design for direct batch processing of unstructured data, and it was a huge success.
2 years of experience in Internet development (Java), 2 years in Java EE development, 3 years in crawler/spider development, 2 years in search engine (Lucene) development, and 1 year in Hadoop, including MapReduce programming, HDFS, Hive, HBase, etc.
My area of interest is computers; I have completed a Bachelor of Engineering in Computers and a Master's in Computer System Security. Professionally, I like working as an administrator, so I chose Oracle database administration along with Hadoop. I have been working on these two skills for the past 2 years and have sufficient knowledge to complete tasks on my own.
One year of working knowledge and experience handling Big Data with Cloudera's Distributed Hadoop (CDH 4.1.2) and MapReduce programming to process data in the Hadoop Distributed File System, along with ecosystem components such as Hive 0.9.0, MySQL, and Sqoop 1.4.2. Performance-driven software professional with over 3 years of experience handling projects in Java, J2EE, Struts, Hibernate, and Spring. Expertise in object-oriented concepts and exception handling. Currently anchoring multiple Java/J2EE, Hadoop, Spring, and Hibernate projects.
Cloudera Certified Hadoop Developer with 5+ years of software development experience. Strong in analysis and problem solving.
I am an expert in server-side and middleware technologies, with good experience in Hadoop, Hive, HBase, Cassandra, Flume, Chukwa, YARN, Ambari, Cloudera, Thrift, and Avro; Amazon EC2, S3, and CloudWatch; Java and J2EE, JSP, Struts, Hibernate, Spring, Liferay, and WSO2 Stratos; and Pentaho ETL and Pentaho Data Integration.
Expert: cloud computing, Hadoop, automation
Experience: 7+ years in Java, 3+ years in Apache Hadoop. Apache Hadoop contributor.
We are an expert Hadoop team with 3-5 years of experience, and we believe in customer satisfaction. We have experts in different technologies, including Hadoop, MapReduce, Hive, Java, MongoDB, PHP, MySQL, and WordPress.
I am Sachin, with nearly 7 years of experience in software development and testing. Education: BE (Electronics), MTech (Computer Science). Work experience: 7 years total - 3.9 years at Altair Engineering (domain: grid computing, i.e., C on Linux), 1 year on IBM's HPC team, and currently on the PowerVC (cloud computing) functional verification team. Good knowledge of workload managers and job scheduling algorithms: PBS (Altair), LoadLeveler (IBM), LSF (Platform Computing). Cloud management software: OpenStack (PowerVC). For the past 2 months I have been actively participating in the Big Data Hadoop training program by Edureka. Big Data: the Hadoop framework with the MapReduce algorithm, NoSQL concepts, data analysis. Strong in C on UNIX/Linux, shell programming and scripting, socket programming, OS and networking concepts, design patterns, and C++ concepts.
8 years of product development experience. Specialty: productizing ideas - I love starting from a blank page. Core skills, chronologically: development, design, architecture, building and leading R&D teams. What else? An Agile fan. Currently working on: thinking BIG with Big Data and distributed computing. Technologies: Hadoop (and its ecosystem), Lucene, Elasticsearch, graph databases
We are a one-stop shop for your data apps and analytics platform. We help build, operate, and manage cloud-agnostic (Amazon, Windows Azure, and Rackspace) infrastructure for your data apps and analytics. We support Tableau, Microsoft BI, IBM Cognos, SQL Server, MySQL, Oracle, Hadoop, and Ruby on Rails competencies. We are a Microsoft, IBM, Tableau, Cloudera, Rackspace, and Amazon partner. Providing all the DevOps capability necessary to operate analytics infrastructure is no easy task, given complex technologies and an increasing number of data sources and the variety and volume of data. LogicMatter eases that burden for you; monitoring and managing this infrastructure requires a broad set of capabilities and, more importantly, dedicated capacity. http://www.logicmatter.com
Working at NetEase, one of the biggest IT companies in China (the agent of "World of Warcraft" in China); www.163.com has a world Alexa rank of 23. I am good at Java, with 5 years of experience. Data analysis with Hadoop, Hive, and Storm is my primary job. I'm proficient in J2EE and MySQL too.
Many years of experience in Java web development and Hadoop.
Hi, I have 5+ years of experience in Android and 1.5 years in iOS app design, development, and packaging at one of the leading software companies. I also have 2+ years of experience in the Hadoop ecosystem. I have 28 Android apps published on Google Play and the Amazon Appstore, and have created many iOS apps for different clients. I am confident that my credibility and skills will help me provide complete solutions and create value. I also have experience with Web Services, XML, JSON, REST, and PHP, and I won a Red Hat Innovation mobile app development contest in 2012-2013. Details of my published apps: http://www.amazon.com/s/ref=a9_sc_1?rh=i%3Amobile-apps%2Ck%3Adroid+hub&keywords=droid+hub&ie=UTF8&qid=1383109890 Please find my portfolio: - 5+ years in Android - 1.5 years in iOS - 2+ years of experience in Hadoop - 5+ years in Java, Web Services, Spring, REST - Pig, Hive, Cassandra, Flume, Storm, MapReduce, Sqoop - Certified Cloudera Hadoop programmer
I am a Java/Hadoop programmer, well versed in Hadoop, MapReduce, Pig, and Hive. I have also worked with Java, JSP, Ext JS, Hibernate, and Spring frameworks, and am good with J2EE technologies including EJB, JMS, JDBC, etc. I have also worked on ETL and data warehousing projects, including with the Talend tool, and am good at object-oriented analysis and design using UML. My passion is coding and writing code that matters.
I am Sai Krishna; I finished my MS by research at IIIT Hyderabad. I have been working in the fields of information retrieval and information extraction for the past 6 years, and have been involved in building many large-scale systems such as web crawlers and web search engines for many languages. My achievements include building a distributed search engine for XWiki.org as part of Google Summer of Code 2008 and conducting a 15-day training session on Hadoop at CAIR, Bangalore, India. Lately, I have been working on architecting and designing a distributed application using HBase and Redis at the backend. I love exploring new open-source projects and technologies. I pursued my master's in the field of information retrieval and have very good experience with Lucene and Nutch (over the past 6 years); in fact, I have made many customizations to these projects and implemented several plugins. I have also been working with Apache Solr for the past 2 years.
Expert in Hadoop and its ecosystem, Big Data, and cloud computing. Contact me for Hadoop (Big Data) consultancy, training, and POC/project/intellectual property development/deployment/troubleshooting. Specialties: Big Data, Hadoop, MapReduce, Hive, HBase, Flume, Pig, Hue, Greenplum, Cloud Computing, Java
TECHNICAL EXPERTISE Languages : Hadoop, Map-Reduce Jobs, Java, XML, XSL/XSLT, HTML. Skills : J2EE - JDBC, RMI, Servlets, JSP, SQOOP, AVRO, OOZIE, FLUME, ZOOKEEPER, Amazon EMR and S3 Frameworks : Spring 2.5/3.0 and Velocity 1.4 Web Servers : Tomcat 5.5/6.0 and JBoss Application Server 5.1.0 ORM Tool : Hibernate 3.6.x. IDE Tools : Cloudera VMWare, Eclipse 3.x.x, JBuilder 2007. Software Tools : RSA for Modeling, TOAD for Oracle, Putty and Lotus Notes. Source Control : IBM Rational Clear Case, Sub Version and Microsoft VSS Databases : PIG LATIN 0.11, HIVE, HBase and Oracle 10g. Operating Systems : Windows XP, Solaris 9 and 10, Ubuntu, CENT OS, Linux CERTIFICATION Cloudera Certified Developer for Apache Hadoop (CCDH), CLOUDERA Sun Certified Java Professional (SCJP)
Hadoop admin with 2 years of hands-on experience on Cloudera (CDH3/CDH4) and Hortonworks distributions. Experience with very large clusters, installing and administering the whole Hadoop ecosystem including HDFS, HDFS HA, NameNode, Secondary NameNode, JobTracker, TaskTracker, Hive, HBase, Sqoop, Flume, monitoring, and tuning. Oracle DBA with 10 years of hands-on experience, versions 8i-11gR2: RAC, Data Guard, ASM, RMAN, partitioning, upgrades, instance tuning, installations. Red Hat sysadmin with 2 years of hands-on experience: installations, KVM, iptables, SELinux, Apache HTTP Server, shell scripting.
Software professional with 4.5+ years of experience in analysis, technical design, development, integration, testing, and project management. Expertise in all stages of the software development life cycle: requirement analysis, design specifications, coding, debugging, and testing. Expertise in Hadoop/Big Data technologies: Hadoop Distributed File System (HDFS), MapReduce, Pig, Hive, HBase, ZooKeeper, and Sqoop. Excellent exposure to interacting with clients and teams. Expertise in obtaining project requirements from users, writing system specifications, translating user requirements into technical specifications, preparing requirements documents, formulating requirements into design specifications, and tracking project progress against contemporary architecture. Good team player with an interest in exploring, learning, and quickly adopting new tools and software as required. Rated as Exceptional at my present company and awarded Best Employee across the organization.
- 9 years of overall professional experience, most of it in analysis, development, review, and testing - Worked on large data sets using Hadoop, Pig, and Hive - Very good work experience in Base SAS, SAS Macros, and SAS SQL - Analytics experience using SPSS - Experienced in SAS analytics and statistical modeling - Prepared ETL jobs using SAS DI Studio - Worked on Oracle PL/SQL - Used SAS DataFlux to implement data quality - Worked on data marts and data warehouses
Well-experienced and skilled Java developer, specializing in distributed frameworks such as Hadoop.
We specialize in using a variety of tools for data analysis, statistical and predictive modelling, machine learning and information visualization. Tool proficiency:
* Data analysis - R, Python, SAS, Scala
* Machine learning and predictive modelling - R, scikit-learn (Python)
* Big Data - Hadoop, Hive and Apache Spark
* Visualization - D3.js, Graphite/Cubism for time-series data
* Databases - Postgres, MySQL, MongoDB, Cassandra
For more details visit www.smokelift.com
Hi, this is Ramesh. I have done my master's in Computer Applications. I started my career with SQL Server 2000 database administration and development, and I am a certified Oracle Database Administrator Professional as well as a certified developer. I also have hands-on experience with Siebel CRM 7.7. Currently I work in Hadoop administration and development, and I have built my own Hadoop Big Data centre: a multi-node, cross-platform cluster running on commodity hardware (Pentium 4 machines). I am working on several Big Data analytics use cases.
This is Karamjit Singh. I am a software engineer by profession with extensive expertise in the design, development and deployment of multi-tier web applications (Java/J2EE, Ruby on Rails, PHP), client-server applications, and large-scale, distributed, failover-safe systems built on Hadoop/HBase/MapReduce and Amazon Web Services, as well as e-commerce applications, service-oriented applications, enterprise application development and integration, Lucene implementation in large-scale applications, application architecture design and code refactoring. Technical summary - Languages: Core Java, Ruby, PHP, SQL, PL/SQL. Technologies: J2EE, XML, XML Schema, AJAX, Web Services, Google Maps, Twitter API, Jabber API. Frameworks: Ruby on Rails, Struts, Spring, JSF. Data stores: MySQL, Oracle, DB2, Hadoop/HBase/MapReduce. Tools: Hibernate, Axis.
Software engineer and research scientist with a Ph.D. in computational physics and 10 years' experience in national labs, academia and the private sector. Core languages are Java and Java EE, MATLAB, C, and R/SAS; also proficient in C++, Python and Perl. Practical experience and formal training in machine learning/data science, computational physics, financial engineering and nanotechnology. Most important of all: I'm laser-focused on the bottom-line business value of the services I provide, and committed to software engineering best practices that yield faster results and better software. Specialties: Java, Java EE, MATLAB, R, SAS, C, C++, data science, machine learning, Big Data, analytics, computational physics, computational chemistry, SQL, data visualization, data modeling, data analysis, optimization, materials science, scientific programming, Linux, financial engineering, econometrics, derivatives, English, J2EE, agile, Hadoop, signal processing, image processing.
We are not just another web-development or Excel/Tableau-based data analytics company. We solve real problems: problems where most tools simply fail. At BigStem we understand data along its four V dimensions: Volume, Velocity, Variety and Veracity. Volume: collecting and storing gigabytes and terabytes of data. Velocity: a real-time explosion of time-series data from hundreds of sources. Variety: data arriving in several forms and formats, structured or unstructured, encrypted or compressed. Veracity: staying accurate (to within a stated confidence level) in the face of huge data. We are experts in the Hadoop family and in analytics mindsets. We are experts in descriptive analytics: we provide flexible visuals and content aggregations, using statistical tools and models, so that domain experts can make decisions. We also build simulation analytics: query models that can tune simulations and generate results (for example, logistic regression, k-means or nearest-neighbour models abstracted behind a query wrapper).
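As a rough illustration of the kind of model that can sit behind such a query wrapper (a minimal sketch in plain Python, not BigStem's actual implementation), a k-means clusterer reduces to an assignment step and an update step:

```python
import random

def kmeans(points, k, iters=20, seed=42):
    """Minimal Lloyd's-algorithm k-means over tuples of floats."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # Assignment step: attach each point to its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[i].append(p)
        # Update step: move each center to the mean of its cluster
        # (empty clusters keep their previous center).
        for i, cl in enumerate(clusters):
            if cl:
                centers[i] = tuple(sum(x) / len(cl) for x in zip(*cl))
    return centers

data = [(1.0, 1.0), (1.2, 0.8), (8.0, 8.0), (8.2, 7.9)]
print(sorted(kmeans(data, 2)))  # roughly [(1.1, 0.9), (8.1, 7.95)]
```

A query wrapper in this spirit would simply map an analyst's query onto calls like `kmeans(data, 2)` and return the fitted centers or cluster labels.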
Goal: make Big Data and Cloud the lifeblood of enterprises, and ensure I'm at my best when coaching the next person who needs it.
+ Migrating current infrastructure to Hadoop/HBase or Cassandra
+ Providing ETL design and architecture on top of existing infrastructure using Hadoop
+ Establishing workflows on top of Big Data
+ Integrating state-of-the-art Big Data integration and visualization technologies such as Talend and D3
+ Helping decide between real-time and near-real-time approaches for online analytics
+ Crisis handling: disaster recovery, NameNode troubleshooting, data-loss prevention and recovery on HDFS
+ Training in Big Data technologies
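The MapReduce programming model that underlies this kind of Hadoop ETL work can be sketched in a few lines of plain Python: a local simulation of the map, shuffle and reduce phases (illustrative only, not production Hadoop code):

```python
from itertools import groupby
from operator import itemgetter

def mapper(line):
    # Map phase: emit a (word, 1) pair for every word in the record.
    for word in line.lower().split():
        yield word, 1

def reducer(word, counts):
    # Reduce phase: sum all counts for a single key.
    return word, sum(counts)

def mapreduce(records):
    # Shuffle phase: sort mapper output and group it by key,
    # as Hadoop does between the map and reduce phases.
    pairs = sorted(kv for rec in records for kv in mapper(rec))
    return dict(reducer(k, (c for _, c in g))
                for k, g in groupby(pairs, key=itemgetter(0)))

print(mapreduce(["big data on hadoop", "hadoop scales big data"]))
# {'big': 2, 'data': 2, 'hadoop': 2, 'on': 1, 'scales': 1}
```

On a real cluster the same `mapper`/`reducer` pair could run over HDFS blocks via Hadoop Streaming, with the framework handling partitioning, sorting and fault tolerance.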