Java projects on time Perl projects on time I have a second for less than a dime
I am an Application Analyst who has worked in a variety of roles and provided solutions on the EMC Documentum product suite. I am currently looking for good opportunities in Hadoop, MapReduce, EMC Documentum, or Captiva software. Highlights of my skills: Documentum xCP, Documentum Composer, DFS, WDK, DFC & DQL, Captiva 7.0 InputAccel & Dispatcher, Hadoop, MapReduce.
This is Karamjit Singh. I am a Software Engineer by profession with extensive expertise in the design, development, and deployment of multi-tier web applications (Java/J2EE/Ruby on Rails/PHP); client-server applications; large-scale, distributed, failover-safe systems such as Hadoop/HBase/MapReduce; Amazon Web Services; e-commerce applications; service-oriented applications; enterprise application development and integration; Lucene implementation in large-scale applications; application architecture design; and code refactoring. My technical summary: Languages: Core Java, Ruby, PHP, SQL, PL/SQL. Technologies: J2EE, XML, XML Schema, AJAX, Web Services, Google Maps, Twitter API, Jabber API. Frameworks: Ruby on Rails, Struts, Spring, JSF. Databases: MySQL, Oracle, DB2, Hadoop/HBase/MapReduce. Tools: Hibernate, Axis.
I am a seasoned IT professional looking for some side work that is interesting and in Big Data.
I am a researcher and software developer interested in machine learning, data mining, and robotics. I have commercial experience with Big Data processing via Hadoop, HBase, and Pig, and with machine learning using MATLAB, R, and Python.
Software developer with experience in large scale data processing, data management and testing.
Programming: strong programming background in Java and C/C++; Amazon EC2 API; the MapReduce principle (Hadoop); supercomputing and Big Data in the cloud; close-to-hardware programming in C and VHDL. Experience: worked as team leader on a team project with Ericsson - Australia. Soft skills: experience working in and leading diverse teams at university; tolerant and respectful, but focused; disciplined. Awards: best-student award for my final-year project as team leader with Ericsson. Double degree: B.Sc. Technische Informatik (Information Engineering/Computer Engineering) at Cologne University of Applied Sciences (Cologne, Germany) and B.Sc. of Information Technology at Swinburne University of Technology (Melbourne, Australia).
14 years in software development. Expertise with the following technologies: - .NET, C#, ASP.NET - Java/J2EE with servlets and JSP only - Big Data with Hadoop and related technologies - MapReduce, with development experience in common patterns including summarization, joins, sampling, data organization, and filtering - Pig - Hive - Oozie - Sqoop
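The MapReduce design patterns named in the profile above (summarization, joins, filtering) all share the same map/shuffle/reduce pipeline. As a rough illustration of the summarization pattern, here is plain Python standing in for the Hadoop runtime; the names (`word_mapper`, `count_reducer`) are illustrative, not Hadoop API:

```python
from itertools import groupby
from operator import itemgetter

def map_phase(records, mapper):
    # Apply the mapper to every input record, yielding (key, value) pairs.
    return [pair for record in records for pair in mapper(record)]

def shuffle(pairs):
    # Group intermediate pairs by key, as the framework does between map and reduce.
    pairs = sorted(pairs, key=itemgetter(0))
    return [(k, [v for _, v in grp]) for k, grp in groupby(pairs, key=itemgetter(0))]

def reduce_phase(grouped, reducer):
    # Apply the reducer once per key with all of that key's values.
    return [reducer(key, values) for key, values in grouped]

# Summarization: count occurrences of each word across all input lines.
def word_mapper(line):
    return [(word, 1) for word in line.split()]

def count_reducer(key, values):
    return (key, sum(values))

lines = ["big data big wins", "data wins"]
result = reduce_phase(shuffle(map_phase(lines, word_mapper)), count_reducer)
# result == [('big', 2), ('data', 2), ('wins', 2)]
```

The other patterns mentioned differ only in what the mapper emits and how the reducer folds the grouped values (e.g., a join emits records keyed by the join field from two datasets).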
Aajsoft is a Big Data solution provider serving customers in the USA, Japan, and India. We provide technical training in Hadoop, HBase, Pig, and Hive, and specialize in MapReduce development using Java on Hadoop and HBase. We have experience with the different Hadoop stacks, including Apache, Cloudera, Hortonworks, and MapR, as well as with Oozie, Hue, Sqoop, and various NoSQL products; we also have development experience in Cassandra and MongoDB. At Aajsoft, our development team represents one of the leading resources for innovative and sophisticated web-based solutions using Java and .NET. We have designed many websites and web-based applications for companies of different sizes, and we help clients bring the right combination of strategy, experience, technology, design, and development to every aspect of their web-based projects.
6+ years of work experience, including 5+ years in the telecommunications industry and 2+ years as a research engineer on the telecommunications fingerprint team at Huawei. Author of http://famousfollowers.appspot.com
I am a working IT professional. I would like to work on interesting projects. Quality of work is guaranteed.
Clark Multimedia has 10+ years of experience developing custom software solutions, web applications, and innovative web sites for our clients. We specialize in: * Web design * eCommerce * Web application development * Custom software * Cross-platform (Windows / Mac OS X / Linux) software * Web spiders/scrapers/crawlers * Social media integration * Video game development * iPhone/mobile applications * Java, C/C++, Python, VBScript, Haskell * JSP/Servlets: Tomcat, GlassFish, Struts, Hibernate * MapReduce: Hadoop, Disco * Data mining * Artificial intelligence / recommendation systems
We are a group of people passionate about the evolution of Big Data storage and analytics. Our strongest areas are: - Hadoop setup - MapReduce programming - Storage and analytics - HBase. Instead of focusing on money, our team has set knowledge acquisition as its goal.
I have 7 years of IT experience serving some of the majors in the Fortune 500. More than anything else, it is my passion to explore new technologies, solve new problems, and sharpen my technical skills. Let's work together and put that mettle to the test.
I have 12+ years of experience in software development and design. My primary skills are C/C++ on Linux/Unix, and I have also been working with Java technologies for the last two years. My work ranges from application programming to system programming. At present, I am working on big data analytics using Hadoop technologies.
I am an experienced Hadoop developer with good knowledge of MapReduce and Hive. I worked for 2 years as a Hadoop developer at a leading IT firm, and I am interested in taking on Hadoop-related projects.
I have 3.7 years of experience with a variety of technologies, including PHP (Magento, Joomla, CodeIgniter), Ruby on Rails, Backbone.js, CoffeeScript, Hadoop, Pig, MapReduce programming, jQuery, Bootstrap, and CSS.
Language: Python, C, Shell Enjoy: Data Mining, Machine Learning, Recommendation System
We are a team of programmers with experience at leading technology and finance companies such as Google and D. E. Shaw. We have vast experience with a variety of technologies and have designed, built, and implemented large-scale applications. Our team also includes industry experts in big data problem areas such as crawling, analytics, and MapReduce.
My name is Zhao Jiajun. I am a graduate of Zhejiang University, where I majored in software engineering. The past four years at Zhejiang University provided me an excellent environment for both studying and living; I gained solid professional skills in the field of software engineering, made many friends, and was honored as a Merit Student and with a Third-Class Scholarship. My greatest strengths are Java programming and web design. I have kept improving my Java since my first year at ZJU, while also expanding my horizons to web design, project management, and more. In addition to my professional skills, I have good communication skills and a strong team spirit.
I'm a clojure hacker, with extensive distributed computing experience. I'm also well versed in Java. I prefer to work only on weekends and holidays.
I am Sai Krishna; I finished my MS by research at IIIT Hyderabad. I have been working in the field of information retrieval and information extraction for the past 6 years and have been involved in building many large-scale systems, such as web crawlers and web search engines for many languages. My achievements include building a distributed search engine for XWiki.org as part of Google Summer of Code 2008 and conducting a 15-day training session on Hadoop at CAIR, Bangalore, India. Lately, I have been architecting and designing a distributed application using HBase and Redis at the backend. I love exploring new open-source projects and technologies. Having pursued my master's in information retrieval, I have very good experience with Lucene and Nutch (for the past 6 years); in fact, I have made many customizations to these projects and implemented several plugins. I have also been working with Apache Solr for the past 2 years.
12+ years of IT experience. Played various roles, such as developer, technology lead, SOA architect, and lead Big Data architect, on very large projects with Fortune 500 customers. 3+ years of experience in the Hadoop ecosystem, with deep technology understanding, architecture and design skills, and hands-on experience. Implemented data analytics on datasets ranging from 10 TB to 30 PB (yes, I mean petabytes). Installed Hadoop clusters on EC2, Red Hat, and CentOS, on 10 to 2,000 nodes. Innovative thinking, rich field experience, deep technology expertise, and great creativity.
I am a Cloudera Certified Administrator and Developer for Apache Hadoop (CCAH & CCDH), working with Big Data, and am seeking companies interested in efficiently and safely managing their rapidly-growing data. I perform my work for a company with a consultant attitude, whether I am a contractor or an employee. I consider the people I work with as my customers. This approach ensures I will always provide high-quality service. In addition, as I fulfill the needs of my customers, I strive to identify and fulfill needs previously unknown to them. This enables me to deliver a product or service that exceeds their expectations. I am very experienced in working with projects remotely. I enjoy automating complex and repetitive processes, to eliminate human error, and enable personnel to focus on higher-level responsibilities. Allow me to assist you in the success of your next project.
Hadoop and Big Data fan. If you search my name (Jagat Singh + Hadoop) on Google, you can find more about the work I do (or have done) in the Hadoop world. I have experience in MapReduce, Pig, Hive, Sqoop, Oozie, Storm, Mahout, RHadoop, R, and Cascading. I hold the following certifications: * Cloudera Certified Hadoop Developer * Cloudera Certified Hadoop Administrator * IBM Certified Cloud Solution Provider * Sun Certified Java Programmer * ITIL certified for managing IT services. I offer complete end-to-end solution design/architecture and deployment skills. I work on only one project at a time, so before confirming I would share my current workload with you and when I could start. I am currently in the Sydney timezone. Happy to talk further with you, and thanks for reading.
Hi, please find attached my details and resume. I work at Capgemini Consulting India Pvt Ltd, Bangalore. I have 7+ years of overall IT experience, including 2+ years in the Hadoop ecosystem and the remainder in Java technologies. I would like to apply for the position with reference to your mail. Regards, Ravi Sankar. Mobile: +91 --
Mirallen Systems, Inc. provides the full range of skills needed to direct and deliver a successful solution. We believe in quality of work rather than quantity, meeting and exceeding client expectations. We provide software services in the following domains: * Customized software development: Java, Hadoop, Hive, Pig; C#, VB.NET, C++, WCF, WPF; .NET Remoting-based applications; n-tier applications. * Web-based applications: ASP.NET, Silverlight, Ajax, PHP-based applications; web services (.NET, PHP). * Back-end databases: SQL Server, MySQL, MS Access, SQLite, etc. * Other: SMS-based applications; text extraction/parsing; complete email solutions (SMTP, IMAP, POP3)
I have been working with big data technology for two years, and I have experience in both web and desktop application development using Java.
Hi there, I am Manoj Sahu, from India. I have almost 3.5 years of experience in the IT sector, and I hold an MBA in IT; IT is my passion. I am also Red Hat and MCITP certified. I am now switching to Hadoop/Big Data, and I am looking for part-time work where I can grow my knowledge.
C/C++/x86 Assembly Objective C Java/Spring/AOP PHP/HTML5/AJAX/ JQuery MongoDB/MapReduce/NoSQL OpenSSL/PublicKey Cryptography Python/Lisp/Flex/Bison super computing clusters parallel computing environments
Software Engineer in .NET and Microsoft technologies. I am involved in system implementation and in solving complex business and logic problems using the latest software technologies.
Technology consulting company focused on Big Data, Amazon Web Services and Social Media applications.
20 years of professional programming experience. Expert in Java and C++ on Mac, Windows and Linux. 10+ years of high scalability, server design and implementation experience. 2+ years of professional Hadoop and HBase development and maintenance in a production environment.
Android: 22 Projects, 15 frameworks, and 11 libraries so far. JEE/Web development: Lead developer in over 32 medium and 4 large scale projects with big data. Under strict NDA. For more info, please get in touch.
I have more than 1 year of experience with Selenium and 2+ years of experience with Java, including writing automated regression test scripts in Java. I am good at designing test cases, and at manual testing too. I now work for an American data-analysis company on Hadoop Pig scripts and automated MapReduce testing. Other experience: configuring AWS and EMR, both programmatically and through the console; good knowledge of SSH.
12+ years of IT consulting experience. Played various roles such as developer, technical lead, SOA architect, and lead big data architect. 2+ years of experience in the Hadoop ecosystem. Implemented large data-processing applications (10 TB - 30 PB). Installed and managed Hadoop on EC2 and Linux (10-2,000 nodes). Hortonworks Certified Hadoop Developer. Hortonworks Certified Hadoop Administrator. Cloudera Certified Hadoop Developer. Cloudera Certified Hadoop Administrator. Winner of national-level C programming contests.
I have been working as a professional software developer for the last 5 years, including full-time roles at IBM and Adobe Systems. I am an Oracle Certified Java Programmer.
Providing quality services at economical rates always remains a primary objective for us, as our competitive rates help us build and maintain long-term business relations. We believe in retaining old customers and gaining new ones through our contributions to their business. Our wide array of services includes technologies such as PHP, PHP 5, CSS, CSS 3, WordPress, CodeIgniter, Google Maps API, Google Apps, HTML, HTML 5, .NET 4.0, SQL Server, Coral, CMS, Flash, Ajax, content management services, payment gateways, testing, etc., and the list keeps growing.
A freelance consultancy for financial modelling and backtesting packages built on quantitative trading strategies, catering to customer needs with extensive experience and skill sets in the R programming language, C++, C#, .NET, Java, RDBMSs, Big Data, algorithms, and statistical models in the financial fixed income and derivatives trading domain.
Expert data technologist using Python to get the job done. My skills include data acquisition, wrangling, modeling, predicting, visualisation, and other data-related tasks. Pandas, NumPy, SciPy, IPython, ggplot, and other libraries are in my toolbox. Linux is my development platform. My working languages are English, Russian, Portuguese, and Lithuanian.
SLU Dev provides professional IT Management to global clients that range from small Start-Up companies to Fortune 500 firms. We are managed systems experts with over a decade of experience. Amazon Cloud, Rackspace, Google Compute and Microsoft Azure are all services we work with daily. We guarantee all our work. So there's never a risk working with us. Feel free to contact us to talk about your project before making any commitments. We specialize in Performance Tuning large cloud deployments, Cloud Monitoring, Cloud Growth strategies and Developer support.
We are a small team of highly energetic and professional people with various backgrounds, providing high quality services to our clients around the world.
Responsibilities - Design, develop, and optimize high-volume big data, analytics, and business intelligence systems in an agile development environment. - Develop, maintain, and improve existing and new data-related scripting, automation, and processes. - Track, verify, and continually evolve data sources, data flows, tools, and storage mechanisms. - Produce and maintain accurate, high-quality technical and system documentation. - Contribute to the design, architecture, and implementation of a data-engineering infrastructure. - Innovate novel solutions to address challenging technical problems. Experience - Strong analytical and problem-solving skills, particularly those that apply to a Big Data environment. - Hands-on development experience with distributed systems, sizing and building large-scale/high-performance solutions. - Deep understanding of and related experience with Apache Hadoop (Common, MapReduce, HDFS) and HBase. - Experience using/developing infrastructure solutions using the distr
Changing a company through the implementation of new ideas and new strategies. Creating web applications that improve ways of life.
Experience in Hadoop, MapReduce, Java, Amazon S3, Amazon AWS EC2, HDFS, Amazon EMR, Hive, HBase, Cassandra, Pig, Sqoop, ZooKeeper, and MongoDB. I write MapReduce programs to implement algorithms in the fields of machine learning and genomics, and have configured Hadoop clusters on the Amazon cloud to handle Big Data (1 TB/run).
Aiming to become a Software Engineer, Software Developer, Responsive Web Developer, Database Administrator, Web Applications Developer, Security Engineer, Security Architect, Cloud Software Engineer, Cloud Services Developer, Cloud Systems Administrator, or Cloud Systems Engineer. Bachelor of Science in Computer Science and B.A. in Russian, UC Davis; expected graduation date: June 2014. Knowledge of backing up files without using any Windows utilities, and of repairing a Windows 7 OS by fixing a corrupt partition table using the bootrec.exe and bcdedit commands in MS-DOS. Specialties: Ubuntu, Knoppix, FreeBSD, C, Visual C++, Visual Studio, the Vi/Vim/Nano/Emacs/Sublime text editors, Turbo C, DOS, Excel 2010/PowerPoint 2010/Publisher 2010/Outlook 2010/Access 2010, XAMPP/LAMP, MySQL, Dropbox, Drupal 7 CMS, WordPress, Cascade Server CMS, Concrete5 CMS, Cygwin, PuTTY, WinSCP, WinRAR, VMware, FileZilla Client, Plone 4 CMS, SyncBack Pro, CSS 3, HTML 5
To bring together knowledge of networking, programming, cloud, and open-source tools. Extensive knowledge of the various networking protocols of TCP/IP. Worked extensively on Ubuntu, BackTrack, Fedora, and CentOS. Implemented the MapReduce framework in Java using Hadoop on multi-node clusters. Specialties: Network security: network intrusion and detection; Metasploit, nmap, netcat, HackerDefender, rootkits. Languages: C, Java, SQL, HTML, XML. Scripting: Perl, Bash, Python. Other: Hadoop, MapReduce, Hive, Pig, HBase, Apache Maven, Oozie, Sqoop, Apache Ant, ETL processes, Apache HttpClient, JBoss, Tomcat, OpenTSDB. Tools: Eclipse, NetBeans, Karmasphere. Business intelligence: MicroStrategy
Oracle Certified Developer in Java and J2EE Web Component Developer. Experienced in text analysis, NLP, Python, and NLTK. Master's degree holder in Computer Science & Engineering.
* More than 10 years of experience in the software industry, with expertise in many web and data-mining technologies such as Web 2.0, Java, the Spring framework, Hadoop, HBase, Lucene, Solr, Elasticsearch, Katta, J2EE, XHTML, XML, Groovy, machine learning (SVM, CRF, MaxEnt), server operating systems (Linux/Windows), data mining, and search engines. * A successful track record of implementing the eXo Platform open-source project and helping the eXo Platform SARL company grow from 2 people to 50 employees; eXo Platform SARL employed more than 100 collaborators worldwide by 2009 (ref: http://wiki.exoplatform.com/xwiki/bin/view/Main/Project%20History).
The diverse web development experience, energy, creativity, and adaptability gained from 9 years as a Disney Senior Staff Engineer and 3 years of consulting make me uniquely qualified to lead Java/J2EE, data-driven website, and retail commerce projects. I am a solid, experienced full-stack Java/web systems developer. I have successfully completed eCommerce, marketing, front-end and back-end development, search optimization and enhancement, and database and API services projects. As a full-stack web Java/J2EE developer/architect with experience in web services and in traditional and mobile sites, I believe in focusing on a project and finding unique and efficient methods to achieve critical goals. I take risks where necessary, but not at the cost of being reckless; I am a quick learner, excited by opportunities to master new and cutting-edge technologies. Currently I am focusing on leveraging the Amazon Web Services cloud and Hadoop/Big Data to deliver timely solutions to complex problems.
We provide solutions to small businesses for application development, deployment, and support. Contact us for Hadoop (Big Data) consultancy, training, POC/project development, deployment, and troubleshooting. Specialties: Big Data, Hadoop, MapReduce, Hive, HBase, Flume, Pig, Hue, Greenplum, cloud computing, Java
I'm a software engineer passionate about programming, algorithms, big data, machine learning, and services administration. During my career as a software engineer, I have had the chance to immerse myself in these technologies, and as a consequence I have built up solid knowledge that has helped my current company (Atigeo) increase its revenue. An easy learner, I'm always interested in learning new things and playing with new technologies. For the last 2 years I have had the chance to work with the latest big data technologies (Hadoop, Cassandra, SolrCloud, Hive, etc.). I'm looking forward to challenging projects and to improving my skills.
- Master's degree in Machine Learning, with application in Bioinformatics. - PhD in Statistical Machine Learning, with applications in recommendation system, personalization, and social network marketing, sponsored by both industry and the government. - Highly experienced in R programming, and familiar with numerous R packages for efficiently handling of large data sets, parallel/cluster computing, statistical analysis, and visualization; Using both open-source R and Revolution R. - Working knowledge of Oracle, MySQL, SQLite databases. - Working knowledge of Hadoop cluster computing, on both private cloud infrastructure (with nodes running Ubuntu/CentOS) and Amazon Web Services, including EC2, EMR (Elastic MapReduce), and S3. - Knowledge and experience in text mining and NLP. - Project management with Microsoft Project and Atlassian JIRA. - Source version control with Git, SVN, and CVS, deployed on private infrastructure and Github/BitBucket.
Over the last 10 years I have built scalable ad networks using Linux, SAS, Python, and C++ with information retrieval and machine learning algorithms, deployed on AWS and Datapipe. I architect scalable scrapers in Python using AWS EC2 and MapReduce, scalable time-series forecasting, and machine learning algorithms for real-time decision-making in online solutions. I also optimize SEM using the Bing, Yahoo, and AdWords APIs with integrated Google Analytics API support. My corporate roles are at the CTO, Chief Scientist, and Chief Architect level. I currently use the development stack below to maintain my own highly available, low-latency, high-throughput ad server and real-time bidding engine (RTB) for a Demand-Side Platform (DSP) that currently handles 8,000+ bids per second with a sub-100 ms response. My stack is nginx, uWSGI with the gevent loop, Python, Redis, Riak, Druid, Hadoop, and MySQL. I am currently integrated with Google AdX and am an approved third-party advertiser for AdX RTB.
Certified in Hadoop by Cloudera. Core skills: a) cloud computing b) Hadoop, HBase, MapReduce c) CouchDB, MongoDB, Redis, Memcache d) Java/J2EE, RoR (Ruby on Rails)
* 6+ years of progressive experience in the design and development of web-based and client-server applications. * 3 years of experience designing and developing applications using Ruby on Rails technologies, and 2+ years of experience in Perl, Perl/CGI, XML, and XSL. * A vision to quickly identify and understand the business benefits of new technologies. * Strong computational, analytical, problem-solving, and design skills. * Highly organized and dedicated, with a positive attitude.
I have been delivering C, C++, Java and shell script based projects for 10 years, specialising in data processing and analysis, network protocols and desktop applications.
I have more than 10 years of experience developing server-side products using C, C++, and Java.
An outstanding consulting professional with strong business, analytical and problem solving skills acquired through experience in information technology, insurance domain and academic learning. Overall 5 years of consulting experience in analysing business requirements, providing solutions, customer interaction, delivering business transformation programmes and implementing IT solutions and systems. Master of Business Administration from University of Liverpool, United Kingdom.
Specialties Enterprise Linux, Hadoop
Solid experience of more than 5 years with all Intelligent Network-related elements. Experienced in supervising and performing systems analysis, administration, and maintenance for network machines. A customer-oriented team player with strong leadership, communication, and decision-making skills, and a demonstrated ability to resolve highly complex system-level issues, recommend and implement standards and procedures, and launch network-related projects. My objective is career advancement that gives me space to diversify my technical experience in a telco company.
My primary areas of involvement are object-oriented development, refactoring, patterns, agile methods, enterprise application architecture, domain modeling, and extreme programming. I have strong leadership and technical management skills; I follow advanced styles of development and always try to be at the peak of technology. I have participated in all phases of enterprise system development, including estimation, design and architecture, implementation, debugging, deployment, integration, testing, refactoring, and maintenance, producing concise technical documentation along the way. I possess good knowledge and experience of enterprise web systems, together with an in-depth understanding of the presentation, web, business, and data access layers and their inter- and intra-communications. Identifying problems: I am apt to understand how things work and insightfully recognize ways to make them better; I see angles that others do not. Creating strategies: I have a big-picture orientati
Solutions for Hadoop, Java, J2EE, Hadoop HDFS, MapReduce, and Hadoop clustering.
* Around 4.1 years of IT experience * Around 2 years of experience in Hadoop framework development * Over 2.2 years of experience in data warehousing applications (OBIEE (Siebel Analytics)/Informatica), MicroStrategy.
Expertise in data mining, machine learning, sentiment analysis, deception detection, information visualization, big data technologies (MapReduce, Hadoop, Spark, Pig, Hive), NoSQL datastores (HBase, Cassandra, MongoDB, Neo4j), scientific and statistical languages (Matlab, R, SAS JMP)
Internet executive with over 14 years of software development experience, including the creation and general management of mid-size organizations, corporate development, product development, business operations, and strategy. Currently General Manager and CTO at Gemicle, a software services and solutions provider specialized in social network applications, games, ecommerce, B2B solutions development, outsourcing, and the elaboration of its own startups. Prior to Gemicle, I was a Branch Director at SysIQ Inc. and a Project Manager and Team Lead at Orneon, Aricent, and KCK. Focused on: - Company management and administration - Project leadership - Strategic planning - Team leadership and coaching - Relationship management - Budgets - Negotiations - Process improvement - Software development and testing - Software architecture. I have solid Java experience: Spring, Spring MVC, Guice, Hibernate, J2EE, EJB, JSP, Servlets, Spring Security (Acegi), JMX, Netty, Hadoop, MapReduce, Cascading, JUnit, TestNG, Maven, An
Experience summary: * 6 years of experience in analysis, design, coding, testing, and support in the information technology industry * Experience with Hadoop ecosystems, MapReduce, HDFS, and HBase, including configuration and installation * Good knowledge of treasury operations covering front-, middle-, and back-office functionalities in a bank. Software skills: * Operating systems: MS Windows, UNIX, Linux (Ubuntu), VMware ecosystem * Languages: Hadoop MapReduce, Pig Latin, HiveQL, C, C++, Oracle PL/SQL, Java SE * Databases: MS SQL Server, Oracle 8i/9i, TOAD, HBase * Front-end tools: Microsoft Visual Basic (VB)
4+ years of experience as a systems analyst and software engineer, with extensive experience performing requirements engineering, software analysis and design, and project management; solid knowledge of Java EE, the WebLogic application server, Oracle database, and Oracle Developer Suite 10g (Oracle Forms & Report Builder). I hold a Graduate Diploma in Systems Analysis and a Bachelor of Engineering (Mechatronics), plus certifications in Building Enterprise Applications Using JEE and in Secure Services Development in Java EE, and I aim to excel in the field of software engineering by continually building upon my current skills and abilities. Strong knowledge of the SDLC and project management principles. Knowledge of software development methodologies such as OOAD and iSDAT, and exposure to Amazon Web Services (S3, MapReduce, Elastic Beanstalk), Hadoop and cloud computing, iOS 5.0 mobile app development, service innovation, the values of co-creation/co-production, and Six Thinking Hats.
I work as a Hadoop developer at a company. My work relates mostly to Java, MapReduce, Hive, Pig, and Sqoop. I also specialize in Android, design, Photoshop, Java, and HTML projects.
Looking for projects using interesting technologies. I've just started getting into Play and Scala.
I graduated from Columbia University with a bachelor's in Computer Science in 2006. After graduation, I spent 4 years as a software engineer at Google. At Google, I worked on mobile web crawling, network traffic monitoring/querying (mostly large-scale data warehousing and querying, NoSQL, etc.), and ZXing, the QR code reader. Most of my work at Google was in C++. I recently left Google to try to go it alone. Since leaving over two years ago now, I have immersed myself in all of the latest mobile and web development. Through working on my own consumer web startup, I have gained solid experience in Ruby on Rails, iOS, and Android development. I know what it takes to get a large project done on schedule and with a high-quality end product. I have helped out on many successful projects. I specialize in developing and deploying large-scale and high-performance websites hosted on various cloud platforms such as Heroku or EC2 (using Opscode Chef), optionally along with a mobile app.
Objective: to fully utilize my computer engineering education, my production work experience, and my programming abilities to manage and maintain large cloud-based systems and to help companies and entrepreneurs achieve their business goals efficiently and cost-consciously. I have spent 20 years in IT and have a wide range of skills spanning programming, software installation, software maintenance, production system monitoring/management, project management, and problem resolution. I am currently working to be a cloud engineer in areas such as Amazon Web Services, Hadoop MapReduce, HBase, and related technologies, provisioning, maintaining, and monitoring cloud computing clusters. I am proficient in many computer languages, including object-oriented languages. I would appreciate an opportunity to work on some of your technical challenges to help your company move forward quickly.
Distributed computing programming using Hadoop MapReduce and the Java programming language. Experience with data mining, predictive modeling, and statistical computing tools such as IBM SPSS, SPM CART, WEKA, XLMiner, and R. Familiarity with data preparation and pre-processing methods in the field of data mining. Good interpersonal skills, coupled with proven experience communicating with business and technology stakeholders. Knowledge of the Rational Unified Process framework for iterative software development. Working technical experience with relational databases using Oracle, SQL, DB2, and Datacom. Expert in object-oriented programming language concepts and terminologies.
I am always looking for opportunities to expand my skills and take part in projects that present a challenge. Over the last 4 years, I have worked on different projects, many requiring not only software craftsmanship (OOP design, agile development, TDD) but also a mindset typically needed in research. I love solving problems that require proof-of-concept (POC) work and experimenting. I am an advocate of open source solutions, and prefer to work in places that have a startup culture. My competencies lie mainly in software development (Java, Python, some C++), but I do not hesitate to work on tasks typically meant for DevOps, DB admins, or QAs. Being versatile is one of my goals. I also have experience leading small, virtual teams of programmers. Chief areas of interest: - Big Data and NoSQL technologies (Hadoop, Hive, MongoDB; glad to learn others), - AI (soft and hard computing techniques), - dynamic programming languages (Python, Lisp).
I have 15+ years of experience with data. More recently I was at Yahoo for 6 years working on Hadoop/Pig with big data. I can help you with all your data needs: - Analytics - Validation and testing - Migration - Modeling - Visualization
I have 2.2 years of Java experience, including 1.5 years of relevant Hadoop MapReduce experience. I worked on a project that uses an international design for direct batch processing of unstructured data, and it has been a huge success.
15+ years of experience in technical leadership and in designing and developing software in the field of object-oriented, distributed, enterprise, and Java-based technologies. Job responsibilities include design and architecture, implementation, interaction with clients and customers to resolve product issues, and mentoring the team. Successfully achieved traction and sign-offs across team boundaries with disparate requirements for baselined, segmented projects. Led critical and complex projects to successful completion and transition to customers. Ability to foster trust and candidness within the team and in relationships beyond it, for a pleasant and appealing environment. Proficient in design, architecture, Java/J2EE/object-oriented methodologies and notation (UML), design patterns, and associated server and RDBMS technologies. Proficient in the iterative and incremental lifecycle. US citizen - no sponsorship required. Blog: http://khanna111.com/wo
* Over 4 years of experience in software development * Expert in Hadoop MapReduce and cloud computing * Skills include Java/J2EE and SQL * Well versed in SDLC tools and processes
I am Premsagar Gadapa. I started my IT career in July 2011 and have 2.5 years of experience in software development. I have worked with different technologies including Big Data Hadoop, cloud computing, and Ab Initio. I am mainly looking to work on Hadoop; I have good knowledge and hands-on experience of the Hadoop ecosystem, including Hadoop installation and configuration, and a good understanding and hands-on experience of the Hadoop stack (MapReduce, Pig, Hive, Sqoop, and Flume). I have done quite a few POCs and projects, and I have good knowledge of cloud computing (EC2 and S3).
I am a Computer Engineering graduate from Nepal. I have two-plus years of professional experience programming in Java. I have also worked on the Hadoop distributed system with the MapReduce framework.
2 years of experience in Internet development (Java), 2 years in Java EE development, 3 years in crawler/spider development, 2 years in search engine (Lucene) development, and 1 year in Hadoop, including MapReduce programming, HDFS, Hive, HBase, etc.
Computer Science is my field of study. I have deep knowledge of the object-oriented programming paradigm, data structures, web services architecture (WSA), the basics of algorithm analysis, artificial intelligence, compiler design, image processing with Matlab, operating systems concepts, and distributed systems concepts. As a software engineer, I have good knowledge of agile techniques, which I have used in some of the applications I helped develop. I believe in open source; I use Linux Ubuntu and develop applications in Java with Eclipse, including Android apps. I have some experience with C++, Qt, Python, the Django framework, Red Hat administration, and the Facebook, Flickr, and Twitter APIs. Big data and the MapReduce model are currently my fields of interest. In my graduation project (handwriting OCR), I worked with ID3 and its successor, the C4.5 algorithm, for classification and for generating a decision tree from training data. I think it doesn't matter what you know, but what you can learn and what you can add to a working team.