I have 11 years of experience as an IT professional, including 1+ year of experience developing MapReduce programs.
I work as a Technical Consultant on Java-based projects and ECM software. I also have solid skills in Android programming, the Vaadin framework, Spring and Hibernate applications, Hadoop, MapReduce, and Captiva software. I'm here to find work on any Java-based projects.
This is Karamjit Singh. I am a Software Engineer by profession with extensive expertise in the design, development, and deployment of multi-tier web applications (Java/J2EE/Ruby/Rails/PHP), client-server applications, and large-scale, distributed, failover-safe systems such as Hadoop/HBase/MapReduce; Amazon Web Services; e-commerce applications; service-oriented applications; enterprise application development and integration; Lucene implementation in large-scale applications; application architecture design; and code refactoring. My technical summary: Languages: Core Java, Ruby, PHP, SQL, PL/SQL. Technologies: J2EE, XML, XML Schema, AJAX, Web Services, Google Maps, Twitter API, Jabber API. Frameworks: Ruby on Rails, Struts, Spring, JSF. Databases: MySQL, Oracle, DB2, Hadoop/HBase/MapReduce. Tools: Hibernate, Axis.
I have 15+ years of experience working with reputed multinational companies in India and the US, involved in designing, developing, and troubleshooting large corporate applications. My expertise is in cluster configuration, Hadoop, Hyper-V, storage management, and storage virtualization. I have also worked on eCommerce web portal development and performance optimization.
I am a Cloudera-certified data engineer looking for interesting work on the side.
I am skilled in the latest big data technology, Apache Hadoop, and its ecosystem components such as Pig, Hive, HBase, ZooKeeper, etc. I can help set up Apache Hadoop in standalone, pseudo-distributed, and fully distributed modes, and I can also help you administer Apache Hadoop for efficient use of resources.
I have good experience in MapReduce, Hive, Pig, Sqoop, Python, Java, and Linux.
10+ years of experience in IT and big data technologies such as Hadoop, MapReduce, Sqoop, and Hive; also experienced in various analytical tools such as R, Pentaho, Datameer, Tableau, etc.
Majoring in Computer Science and Engineering, with a keen interest and added expertise in getting insights from big data. Currently a senior-year undergraduate at IIT Jodhpur. Practice Lead - Big Data, Hadoop, Apache Storm, Spark, live streaming and batch processing. Areas of expertise include: ~ Hadoop administration, MapReduce ~ Real-time analytics: Spark, Storm ~ NoSQL: HBase, MongoDB, Cassandra ~ Big Data/Hadoop solution architecture, design & development ~ Social media, community & viral marketing ~ Big Data strategy
We are very enthusiastic about big data technologies like Hadoop, MapReduce, Hive, HBase, Sqoop, Oozie, Scala, and NoSQL databases like MongoDB. We are proficient in Java and can develop complex MapReduce applications per the client's requirements to handle large volumes of data. We are capable of setting up clusters for parallel distributed computing in Hadoop and can install all the required software. In a word, we can handle everything from scratch to product: setting up clusters, installing Hadoop ecosystem components, and developing MapReduce and Hive applications integrated with components like HBase, Sqoop, and Oozie.
Aajsoft is a Big Data solution provider serving customers in the USA, Japan, and India. We provide technical training in Hadoop, HBase, Pig, and Hive, and specialize in MapReduce development using Java on Hadoop and HBase. We have experience with different Hadoop stacks (Apache, Cloudera, Hortonworks, and MapR), as well as Oozie, Hue, Sqoop, and various NoSQL products, including development experience with Cassandra and MongoDB. At Aajsoft, our development team represents one of the leading resources for innovative and sophisticated web-based solutions using Java and .NET. We have designed many websites and web-based applications for companies of different sizes, and we help clients bring the right combination of strategy, experience, technology, design, and development to every aspect of their web-based projects.
14 years in software development. Expertise with the following technologies: .NET, C#, ASP.NET; Java/J2EE (servlets and JSP only); Big Data with Hadoop and related technologies; MapReduce, with development experience in common patterns including summarization, joins, sampling, data organization, and filtering; Pig; Hive; Oozie; Sqoop.
I have 10+ years of experience designing and developing enterprise solutions in Big Data and mobile technologies. I am a certified Big Data developer/consultant with additional certifications in mobile application development (Android/iOS) and Java web services. I work in the SME pool for Hadoop technologies and have deployed clusters in AWS, Rackspace, and within private client networks. I have developed analytics models and statistical regressions in MapReduce and Spark for data processing, and have set up multiple big data clusters in Amazon and in-house using the Cloudera and Hortonworks distributions. I have also written blogs and content about Big Data. I have 35 Android apps published on the Google Play Store and Amazon Appstore, and many iOS apps created for different clients. More details about me, my white papers, and my implementations can be found at http://ngvtech.in
Hadoop cluster setup and administration of various distributions: Cloudera, Hortonworks, and MapR. Development of ETL workflows using Hadoop ecosystem components.
I have worked with Java for several years and completed 2 internships in which I worked on C# and Java respectively. I recently completed my dual degree at IIT Bombay. I have experience developing MapReduce code in the Hadoop framework, and I know many machine learning techniques and tools.
I have been programming for the past 17+ years in various languages, starting with C, C++, and Visual C++. I have around 11+ years of strong experience in the .NET stack, including ASP.NET, ASP.NET MVC, SQL Server, WPF, and WCF. For the past couple of years, I have been working extensively in Node.js, MongoDB, and AngularJS. I can provide high-quality software for clients.
Experienced in big data analytics. Involved in the bring-up, tuning, and customization of a petabyte-scale big data processing ecosystem (involving the open source technologies Flume, Hadoop, Hive, HBase, Storm, and MongoDB). Experienced in developing big data processing applications and solving problems in the online advertising domain for reporting, analytics, and machine learning. Expertise in open source big data technologies (batch processing/real time). Knowledge and experience of big data ecosystem components for workflow management, distributed data collection, streaming data processing, and NoSQL stores; have worked on distributed storage systems. Specialties: software design, distributed data processing (Hadoop, HBase, Hive, MongoDB, Redis, Flume), streaming data processing (Storm), the MapReduce model of data processing, distributed storage systems, embedded systems, Linux device drivers, and kernel programming.
11 years of experience in application development.
I am Sai Krishna; I finished my MS by research at IIIT Hyderabad. I have been working in the field of information retrieval and information extraction for the past 6 years and have been involved in building many large-scale systems, such as web crawlers and web search engines, for many languages. My achievements include building a distributed search engine for XWiki.org as part of Google Summer of Code 2008, and conducting a 15-day training session on Hadoop at CAIR, Bangalore, India. Lately, I have been working on architecting and designing a distributed application using HBase and Redis on the backend. I love exploring new open source projects and technologies. Having pursued my master's in information retrieval, I have very good experience with Lucene and Nutch (for the past 6 years); in fact, I have made many customizations to these projects and implemented several plugins. I have also been working with Apache Solr for the past 2 years.
A very quick coder and fast learner with very high coding standards. I have 3 years of total work experience, including work with Amazon, Housing.com, Elitecore, and Citrix.
Hi, I have completed a postgraduate degree: Master of Computer Applications.
Experienced Java backend developer. I also have experience with Java frontend frameworks (such as Tapestry and ZKoss). Databases: MySQL, Hadoop, MongoDB, Redis, Couchbase. Test frameworks: JUnit, Mockito, Robot. I have extensive experience with the Hadoop stack (Pig, Hive, Impala, Spark, Flume, Kafka, HDFS, MapReduce, HBase).
I am a Big Data expert with knowledge of Hadoop, HBase, Flume, Hive, Java, MapReduce, Pig, ZooKeeper, etc. I have worked on Big Data projects using Hive, MapReduce, Pig, and Flume, and have completed 3 data mining projects for a U.S.-based client using Weka 3.6.
More than 8 years of expert-level experience in Informatica development, with complete knowledge of Informatica 9.x.
I have been working as a software developer for the past 6.2 years, currently with a leading IT services company. I have a keen interest in application and product development. I work as a Hadoop developer using MapReduce, Pig, and Hive; have worked on database development projects using Oracle, MySQL, and Java; and am working with the NoSQL database Cassandra.
I'm a Clojure hacker with extensive distributed computing experience. I'm also well versed in Java. I prefer to work only on weekends and holidays.
Software developer with 3+ years of experience in core programming languages including C, C++, and Java, with a Master's in Software Engineering. I aim to develop quality software solutions from scratch. I am also interested in mobile games and app development for Android and iOS, and have recently been working on big data technologies to provide industry-level solutions and contribute to the research and development of big data applications.
* 9+ years of experience with strong emphasis on design, development, implementation, testing, and deployment of applications. * More than 2.5 years of experience in the Hadoop ecosystem and Big Data analytics. * Expertise and hands-on experience in core Hadoop and the Hadoop technology stack, including HDFS, MapReduce, Pig, Hive, Sqoop, HBase, Oozie, Storm, Flume, Kafka, Spark, and ZooKeeper.
6+ years of work experience, including 5+ years in the telecommunications industry and 2+ years as a research engineer on the telecommunication fingerprint team at Huawei. Author of http://famousfollowers.appspot.com
I have knowledge of Hadoop and Core Java and am currently working as a Hadoop trainer. I am looking for new Hadoop-related projects where I can provide good service to my clients at a low cost and deliver on time.
Specializing in compilers, programming languages, static analysis, and parallelization
I have 1+ years of experience as a Hadoop developer and admin, with excellent programming and logic skills in big data analysis and managing unstructured data. Strong knowledge of Pig, Hive, MapReduce, HBase, Sqoop, Spark, Java, and Unix scripting.
Providing quick and reliable solutions to Hadoop-related problems (admin, development, and architecture).
Over 13 years of developer experience and 5 years of project and team management. I design and develop highly available, scalable web applications.
Clark Multimedia has 10+ years of experience developing custom software solutions, web applications, and innovative web sites for our clients. We specialize in: * Web design * eCommerce * Web application development * Custom software * Cross-platform (Windows / Mac OS X / Linux) software * Web spiders/scrapers/crawlers * Social media integration * Video game development * iPhone/mobile applications * Java, C/C++, Python, VBScript, Haskell * JSP/Servlets: Tomcat, GlassFish, Struts, Hibernate * MapReduce: Hadoop, Disco * Data mining * Artificial intelligence / recommendation systems
I have 7 years of IT experience serving some of the majors in the Fortune 500. More than anything else, it's my passion to explore new technologies, solve new problems, and sharpen my technical skills. Let's work together and test our mettle.
Expert in data warehousing solutions.
5 years of extensive experience in IT, including 2.4 years in Big Data and cloud computing on Amazon. I have migrated projects such as COBOL programs running on mainframes, DB2 queries, Oracle queries, and long-running SQL queries into a distributed Hadoop environment, along with many POCs.
I have 14+ years of experience in software development and design. My primary skills are C/C++ on Linux/Unix, and I have been working with Java technologies for the last two years as well. My work spans application programming to systems programming. At present, I am working on big data analytics using Hadoop technologies.
I am a working IT professional. I would like to work on interesting projects. Quality of work is guaranteed.
10+ years of experience building scalable, fault-tolerant, high-throughput systems. Expert programming skills in C, C++, Java, and Python, with strong experience in Unix/Linux. Experience with relational databases and NoSQL databases like HBase and Cassandra; big data technologies like Hadoop, MapReduce, Hive, Pig, and Sqoop; and cloud computing tools like Xen, KVM, EC2, OpenStack, LXC, and Docker.
More than 6 years of experience in overall software development, including more than 3 years in Big Data technologies and recommender systems algorithms and implementation. Proven track record of innovation, with patents held.
Specialties: machine learning, mobile development, Big Data, backend programming. Experience: Storage and processing: Hadoop, HBase, MapReduce, Cassandra, MongoDB, relational DBs. Graph databases and processing: Titan, Neo4j, Giraph. Text extraction and NLP: have developed a parser combinator from scratch; GATE, OpenNLP, Stanford NLP. Programming languages: Java, Scala, Python. Backend platforms: Spring MVC, Django, Flask, Dropwizard, Play. Machine learning: have developed custom implementations of algorithms; Mahout, Weka. Pipelines: Apache Kafka, Apache Flume.
I am a Big Data architect with 10 years of experience. I have implemented multiple data pipelines and complex analytic solutions using the Hadoop ecosystem (MR, YARN, Hive, Pig, Sqoop, Flume) and NoSQL solutions (HBase, Cassandra, Couchbase); implemented a real-time pipeline using Storm and Kafka; and implemented DW solutions using Spark and Spark SQL. Extensive knowledge of AWS and Amazon Redshift. Cloudera Certified Developer for Apache Hadoop, Cloudera Certified Specialist in Apache HBase, and Cloudera Certified Administrator for Apache Hadoop. Specialties: Big Data, Hadoop, Spark, Storm, Kafka, Couchbase, Cassandra, HDFS, MapReduce, Pig, Hive, HBase, Sqoop, Crunch, Mahout, Java, JSP, Servlet, Spring, Hibernate, SQL, Ant, Subversion, ExtJS, Android.
Data scientist with a strong programming and academic background, interested in data mining using machine learning and other AI techniques. I've studied machine learning and data analysis through online courses from leading universities (Stanford, Duke, Caltech) and trained on practical projects from Kaggle, scoring in the top 10% of participating data scientists. I've recommended missing links in a social network (Facebook), predicted the salary of a job ad from its contents (Adzuna) and auction prices (FastIron), and built a web page classifier (StumbleUpon). In my regular job, I predicted users' profiles from click streams for advertisement targeting, and age and diseases from biometric data. Specialties: data mining, machine learning, natural language processing. Primary languages: C, Python, R. Secondary: Perl, Bash, MATLAB, C++11. Experience: FreeBSD, Linux; AWS; MySQL; Berkeley DB, Redis; NumPy, SciPy, scikit-learn, Hadoop, MapReduce. Algorithms: general, machine learning, AI.
12+ years of IT experience. I have played various roles such as developer, technology lead, SOA architect, and lead Big Data architect on very large projects with Fortune 500 customers, including 3+ years in the Hadoop ecosystem. Deep technology understanding, architecture and design skills, and hands-on experience. I have implemented data analytics on datasets ranging from 10 TB to 30 PB (yes, I mean petabytes) and installed Hadoop clusters on EC2, Red Hat, and CentOS with 10 to 2,000 nodes. Innovative thinking, rich field experience, deep technology expertise, and great creativity.
We provide solutions for big data problems using Apache Hadoop, Apache Spark, HBase, Storm, Hive, Spark SQL, Kafka, Oozie, Cassandra, Solr, Lucene, Lily, and other tools in the Hadoop ecosystem. We also specialise in machine learning using Mahout, MLlib, GraphX, and R.
Hi, I have over 15 years of IT experience across technology, operations, delivery, and project/program management. I led the RFPs for a pioneering telecom upgrade using Oracle GoldenGate. A seasoned professional who will deliver on commitments with quality. Thank you.
With an M.S. in computer science and more than 17 years of experience in the IT industry, I have managed products and bootstrapped, mentored, and managed high-performance teams. I possess a unique blend of technical prowess, product planning and development experience, deep domain expertise, and an insatiable urge to learn new things, which has enabled me to build complex solutions that meet my customers' specific needs.
14 years of experience developing and implementing machine learning algorithms, predictive analytics, and text mining using R, Python, NumPy, scikit-learn, NLTK, SAS, and Hadoop technologies (Hive, Pig & Spark). Expertise in mining terabytes of structured and unstructured data to gain key consumer and brand insights. Expertise in classification, recommender systems, support vector machines, random forests, boosting models, bagging models, neural networks, logistic and linear regression, clustering, Bayesian statistics, natural language processing, text mining, and sentiment analysis. Expertise in transactional data, CRM data, web data, survey data, demographic data, and spatial data.
SQL & NoSQL Data Architect | ITIL® & PRINCE2® Practitioner | *** Open Network for Database Professionals *** http://www.linkedin.com/in/devendrashirbad -- Devendra is known as "Dev" among the group. -- He is an individual contributor, a highly motivated and results-driven IT professional with ~12 years of experience. -- His technical skills are certified by top blue-chip IT companies like Microsoft, HP, IBM, Oracle, and MongoDB. -- He follows AXELOS Global Best Practices for IT & project management, and is a certified ITIL® & PRINCE2® Practitioner too. -- He is ex-Microsoft CSS (Customer Service and Support) for SQL Server 3rd tier and specializes in SQL Server replication, performance troubleshooting, and data recovery critical situations (CritSit) on PROD servers. -- He has substantial experience in database architecting, database administration, and high availability & disaster recovery (HA & DR) solutions, and has good exposure to data warehousing.
I am a Cloudera Certified Administrator and Developer for Apache Hadoop (CCAH & CCDH), working with Big Data, and am seeking companies interested in efficiently and safely managing their rapidly-growing data. I perform my work for a company with a consultant attitude, whether I am a contractor or an employee. I consider the people I work with as my customers. This approach ensures I will always provide high-quality service. In addition, as I fulfill the needs of my customers, I strive to identify and fulfill needs previously unknown to them. This enables me to deliver a product or service that exceeds their expectations. I am very experienced in working with projects remotely. I enjoy automating complex and repetitive processes, to eliminate human error, and enable personnel to focus on higher-level responsibilities. Allow me to assist you in the success of your next project.
DataHolics develops enterprise class Big Data solutions that bring visibility to a treasure trove of data already flowing through the organization.
Hadoop and Big Data fan, working as a Big Data architect. I offer end-to-end Big Data integration skills. My LinkedIn profile is http://au.linkedin.com/in/jagatsingh. If you search my name (Jagat Singh + Hadoop) on Google, you can find more about the work I do (or have done) in the Hadoop world. I have experience in MapReduce, Pig, Hive, Sqoop, Oozie, Storm, Mahout, RHadoop, R, and Cascading. I hold the following certifications: * Cloudera Certified Hadoop Developer * Cloudera Certified Hadoop Administrator * IBM Certified Cloud Solution Provider * Sun Certified Java Programmer * ITIL certified for managing IT services. I offer complete end-to-end solution design/architecture and deployment skills. I work on only one project at a time, so before confirming I will share my current workload and when I can start for you. I am currently in the Melbourne timezone. Happy to talk further with you. Thanks for reading.
Mirallen Systems, Inc. provides the full range of skills needed to direct and deliver a successful solution. We believe in quality over quantity, meeting and exceeding client expectations. We provide software services in the following domains: * Customized software development: Java, Hadoop, Hive, Pig; C#, VB.NET, C++, WCF, WPF; .NET Remoting-based applications; n-tier applications * Web-based applications: ASP.NET, Silverlight, Ajax, PHP-based applications; web services (.NET, PHP) * Back-end databases: SQL Server, MySQL, MS Access, SQLite, etc. * Other: SMS-based applications, text extraction/parsing, complete email solutions (SMTP, IMAP, POP3)
Data Mining and Machine Learning Engineer. PhD. Have experience with Hadoop, Hive, Mahout, R, Weka, development experience with Java.
Freelancer based in Bangalore, India, with 10 years of overall experience in IT. I have been designing and implementing solutions and products using the big data ecosystem (Hadoop, Mahout, HBase, R, machine learning, Cloudera) for over three years. Please message me if you require further information.
I have more than 1 year of experience with Selenium and 2+ years with Java, including writing automated regression test scripts in Java. I am good at designing test cases and manual testing too. I currently work for an American data analysis company on Hadoop Pig scripts and automated MapReduce testing. Other experience: configuring AWS and EMR programmatically and via the console; good knowledge of SSH.
I am a Hadoop/Big Data developer.
We offer the following services: website design, eCommerce store setup, payment gateway setup, and custom programming.
I have been working as a professional software developer for the last 7 years, including full-time roles at IBM and Adobe Systems. I am an Oracle Certified Java Programmer.
Creative problem solving using appropriate technology and tools.
Data scientist using programming and machine learning algorithms to get the job done. My skills include data acquisition, wrangling, modeling, prediction, visualisation, machine learning, and other data-related tasks. pandas, NumPy, SciPy, ggplot, MATLAB/Octave, and other libraries are in my toolbox. My working languages are English, Russian, Portuguese, and Lithuanian.
EthinkersLab is a group of software professionals who have been hands-on in delivering quality solutions on time. Technology excites us, and we are quick to adapt to emerging trends in technology. We are continuously honing skills that speak for themselves in our deliveries.
10+ years of experience in developer, leadership, and PM positions. Deep knowledge of solving problems in big projects: search engines, data mining, HA systems, load balancing, logging systems, etc. Innovative and open-minded, with the ability to work in a group as well as individually; diligent, with strong problem-solving skills.
We are a US based consulting group providing secure solutions in big data, including Amazon Web Services (AWS) implementation and Hadoop solutions. Our developers and associates are all US based residents and honors graduates from American universities, assuring our clients of the highest level of expertise as well as clear and concise communication. This guarantees the shortest turnaround time for your project while keeping your data secure in the US. Pier Group Consulting is an AWS Consulting Partner specializing in cloud enablement. We focus on AWS and its products, including Amazon S3, EC2, RDS, EMR and Elastic Beanstalk. W-9 available when required.
A freelance consultancy for financial modelling and backtesting packages using quant trading strategies and the latest technologies, catering to customer needs by leveraging extensive experience and skill sets in the R programming language, C++, C#, .NET, Java, RDBMSs, Big Data, algorithms, and statistical models in the financial fixed income and derivatives trading domain.
20 years of professional programming experience. Expert in Java and C++ on Mac, Windows and Linux. 10+ years of high scalability, server design and implementation experience. 2+ years of professional Hadoop and HBase development and maintenance in a production environment.
I build and maintain several types of web-based projects: sites for small and medium businesses, e-commerce platforms, event sites (conferences and festivals), dashboards for marketers and BI analysts, and Facebook apps. I use technologies such as Python, Django, Backbone.js, and Backbone.Marionette on a daily basis. I'm also familiar with different datastore systems such as MySQL and MongoDB.
Technology consulting company focused on Big Data, Amazon Web Services and Social Media applications.
I love working in startups and follow agile engineering. I am religious about lean startup methodologies. Products I have engineered using lean methodologies have raised millions of dollars in funding.
I have 3.5+ years of Hadoop and data engineering experience and have worked on a variety of technologies, spanning from Hive ODBC driver development to recommendation engine development for online streaming media services. I deliver very high-quality work and am able to quickly grasp any new technology. [I'm new on elance.com; I have a 5-star profile on oDesk at https://www.odesk.com/users/~019b5641196dbbc2c3]
I have been working with big data technology for 2 years, and I have worked on both web and desktop app development using Java.
I am well versed in the latest big data technology, Apache Hadoop, and its ecosystem components such as Pig, Hive, HBase, ZooKeeper, etc. I can help set up Apache Hadoop in standalone, pseudo-distributed, and fully distributed modes, and can help you administer Apache Hadoop for efficient use of resources. I have working knowledge of NoSQL technologies like Cassandra, HBase, Couchbase, and MongoDB, and I design architectures and perform capacity planning and resource utilization for better performance.
Seasoned IT consultant for information systems deployments, with experience working in international distributed environments. At present, I'm focusing on Big Data analytics, developing tools that help organizations get more relevant insights from their available data. Recently, I became a Cloudera Certified Developer for Apache Hadoop (CCDH). For more info you can check my LinkedIn profile: http://es.linkedin.com/in/javimartins/en
9 years of overall experience in enterprise software architecture, design, and development: Big Data, data science/mining, Java.
I have six years of experience in software development and have successfully designed and developed over three software projects, each worth more than one million US dollars. I am very eager to work; if you give me a chance to serve you, I can ensure high quality and fast turnarounds.
I'm an expert in reverse engineering closed systems, data extraction and processing, and web automation.
IntelliGrape Software is a premium technology company with proficiency in Big Data technologies like the Hadoop ecosystem, R, RMR, NoSQL, machine learning, visualization, etc. Highlights of our Big Data practice: * Cloudera Certified Hadoop developers and administrators * DataStax Silver Partner * MongoDB Advanced Partner * Proficient in the Hadoop ecosystem, machine learning, visualisation, and ETL tools * Experts in NoSQL data stores * AWS certified architects * MongoDB certified developers
We have skilled data scientists and Hadoop developers with excellent programming skills (Apache Hive, Apache Pig, HBase, MongoDB, Cassandra, Spark, Flume, Sqoop, SQL, MapReduce, R, Mahout, Tableau, Pentaho, Java, JSP, MS Excel).
SLU Dev provides professional IT Management to global clients that range from small Start-Up companies to Fortune 500 firms. We are an Amazon Web Services Standard Partner and have been working with all Amazon Web Services since its inception. We are managed systems experts with over a decade of experience. Amazon Cloud, Rackspace, Google Compute and Microsoft Azure are all services we work with daily. We guarantee all our work. So there's never a risk working with us. Feel free to contact us to talk about your project before making any commitments. We specialize in Performance Tuning large cloud deployments, Cloud Monitoring, Cloud Growth strategies and Developer support.
15+ years of experience. MS in Computer Science, Johns Hopkins University. MBA in Marketing/Big Data, University of Missouri-Kansas City.
Highly productive senior software engineer, software architect, and NoSQL/search engine expert. Highly experienced Java/JEE senior developer. Expert in service-oriented architecture (SOA) and Hadoop/HBase MapReduce, with extensive experience in integration brokers, ESB, and messaging. + NoSQL: graph databases, Bigtable, search engines + Machine learning + Scrum/agile methodology + Comfortable with short release cycles + Experience with Scala and Play 2 + Technical leadership and mentoring. Specialties: software architecture design for high availability; Scrum (agile): continuous integration, unit tests, acceptance tests; Java, JEE, Spring IoC, AOP, Maven, JPA, Hibernate, MVC, servlet containers, RESTful, Solr, Lucene, Elasticsearch, MySQL, Hadoop, HBase, Redis, graph databases, Neo4j, RabbitMQ, James server, Ant, scripting, Jenkins.
I bring together knowledge of networking, programming, cloud, and open-source tools. Extensive knowledge of various TCP/IP networking protocols. I have worked extensively on Ubuntu, BackTrack, Fedora, and CentOS, and have implemented the MapReduce framework in Java using Hadoop on multi-node clusters. Specialties: Network security: network intrusion and detection; Metasploit, nmap, netcat, HackerDefender, rootkits. Languages: C, Java, SQL, HTML, XML. Scripting: Perl, Bash, Python. Other: Hadoop, MapReduce, Hive, Pig, HBase, Apache Maven, Oozie, Sqoop, Apache Ant, ETL processes, Apache HttpClient, JBoss, Tomcat, OpenTSDB. Tools: Eclipse, NetBeans, Karmasphere. Business intelligence: MicroStrategy.
Oracle Certified Developer in Java and J2EE Web Component Developer. Experienced in text analysis, NLP, Python, and NLTK. Master's degree holder in Computer Science & Engineering.
I am presently pursuing a Master's in Enterprise Business Analytics at NUS (National University of Singapore). I have been acknowledged for the ability to understand customer needs and implement strategies to improve customer satisfaction and thus increase revenue. Being analytical and organized, I have the determination and confidence to implement effective solutions.