Email : firstname.lastname@example.org
Web : Tech stuff | Portfolio | Blog | GitHub | StackOverflow | LinkedIn | Resume (pdf)
San Francisco Bay Area, USA
Hands-on Big Data Architect with 15+ years of software development experience, seeking an opportunity to design, lead, develop, and launch great products.
- Software Development
Software professional with many years of experience across enterprise, web, and mobile applications, now focusing on Hadoop, Big Data, and NoSQL. Has given talks at various meetups (Talks); open source contributor (Github). Proactive in learning and leveraging emerging technologies. Mentored engineers; believes in leading by doing.
Hadoop / BigData :
- OS : Linux (system admin level), MacOS, Windows
- Languages: Ruby, Java, PHP, Python, Objective-C / C / C++, Shell (bash)
- Technologies & Tools :
- Web : Ruby on Rails, Tornado (python), LAMP stack
- Databases : MySQL (scalability & replication), NoSQL (HBase)
- iPhone/iPad SDK, building APIs for mobile devices
- Web Services: WSDL/XML/JSON, REST, security
- Eclipse, Xcode, Linux/MacOS development environment, Git, Subversion
- Business Domains : online advertising, location-based services, social media analytics (finding top influencers), e-commerce, online classifieds, enterprise workflow management, RFID
Founder / Principal consultant @ node51.com. (2009 - Present)
Node51 offers services, training, and support around the Hadoop ecosystem.
Principal Hadoop Consultant @ Hitachi Data Systems [2011 May - Present]
I am the lead for the Analytics component of the Hitachi Data platform. Successfully virtualized Hadoop (one of the first in the industry), adopting Linux KVM as the virtualization container. Optimized the disk and network IO paths between virtual machines and hosts to provide optimal IO performance for Hadoop.
Developed a custom FileSystem plugin for Hadoop so it can access files on the Hitachi Data Platform. The plugin allows Hadoop MapReduce programs, HBase, Pig, and Hive to work unmodified and access files directly, and provides data locality for Hadoop across host nodes and virtual machines.
Advised file system team on optimizing IO for Hadoop / analytics work loads.
Set up and benchmarked Hadoop/HBase clusters for internal use
Wrote data ingesters and MapReduce programs
Wrote extensive scripts (Python + shell) to provision and spin up virtualized Hadoop clusters
Mentored an intern working on a big data recommendation engine
Hadoop Architect (consulting) and Instructor @ ThirdEye Consulting Services, Santa Clara, CA [2011 - Present]
I develop and teach Hadoop classes for ThirdEye on the following subjects: Hadoop intro, Hadoop admin, HBase, Pig, Amazon EMR.
Developed materials and labs for HortonWorks' Hadoop training course
Participated in customer engagements and POCs.
Hadoop / Data warehouse Consultant @ Adpredictive, San Francisco (Online Advertising / Targeting) [2009 Nov - 2011 Feb]
Adpredictive is an online advertisement targeting company. I worked closely with the CTO to help them scale to hundreds of millions of impressions per day.
- Hadoop data warehousing on Amazon EC2:
Adpredictive serves hundreds of millions of impressions per day and collects a lot of data. I architected and developed a scalable, cost-effective, and fault-tolerant data warehouse system on the Amazon EC2 cloud. Developed MapReduce/EMR jobs to analyze the data and produce heuristics and reports; the heuristics were used to improve campaign targeting and efficiency.
Technologies used : Hadoop, HBase, Sqoop, Scribe, Java, MapReduce, Amazon EC2 infrastructure, Amazon Elastic MapReduce (EMR), MySQL, shell scripts
more || my talk on EMR experience
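The shape of those MapReduce jobs can be illustrated with a minimal local sketch in pure Python (the log format and field names here are hypothetical; the real jobs ran on Hadoop/EMR):

```python
from collections import defaultdict

# Hypothetical log line: "timestamp<TAB>campaign_id<TAB>event"
def map_phase(lines):
    """Map: emit (campaign_id, 1) for every impression event."""
    for line in lines:
        ts, campaign_id, event = line.split("\t")
        if event == "impression":
            yield campaign_id, 1

def reduce_phase(pairs):
    """Reduce: sum counts per campaign (Hadoop groups keys for us)."""
    totals = defaultdict(int)
    for campaign_id, count in pairs:
        totals[campaign_id] += count
    return dict(totals)

if __name__ == "__main__":
    logs = [
        "1001\tcamp-A\timpression",
        "1002\tcamp-B\timpression",
        "1003\tcamp-A\timpression",
        "1004\tcamp-A\tclick",
    ]
    print(reduce_phase(map_phase(logs)))  # {'camp-A': 2, 'camp-B': 1}
```

On Hadoop the same map and reduce functions run in parallel across splits of the log data, with the framework handling the shuffle between phases.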
- High-performance web service infrastructure
We needed a solution to bid in real time on advertising exchanges. Response times for web services built on a typical LAMP (PHP) stack were too slow for our needs, so I engineered a high-performance, high-volume, highly scalable platform for real-time bidding (more details). Our bidding platform's response time improved from 300 ms to 50 ms.
Technologies used : Python, Tornado Web Server, MySQL, ProtoBuf
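The non-blocking request handling behind that platform can be sketched with Python's standard-library asyncio (the production system used Tornado; the request fields and scoring lookup below are illustrative, not the actual bidder):

```python
import asyncio
import json

async def fetch_campaign_score(campaign_id):
    """Simulate a fast, non-blocking lookup (e.g. an in-memory model)."""
    await asyncio.sleep(0)  # yield to the event loop instead of blocking
    return {"camp-A": 0.8}.get(campaign_id, 0.1)

async def handle_bid(request_body):
    """Answer a bid request well inside the exchange's latency budget."""
    req = json.loads(request_body)
    score = await fetch_campaign_score(req["campaign_id"])
    return json.dumps({"bid": round(score * req["floor_cpm"], 4)})

if __name__ == "__main__":
    body = json.dumps({"campaign_id": "camp-A", "floor_cpm": 1.5})
    print(asyncio.run(handle_bid(body)))  # {"bid": 1.2}
```

The key design choice is that the handler never blocks the event loop on IO, so one process can hold thousands of in-flight bid requests, which is what a synchronous LAMP stack cannot do at this volume.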
full chronological resume available: http://sujee.net/resume/sujee_maniyam_full.html
iPhone / Ruby-on-Rails Consulting Engineer @ Hitachi America Ltd, Santa Clara [2009 Oct - 2011 Mar]
Lead engineer @ Uloop (2008 - 2009 Oct)
Software Engineer @ IBM, Burlingame, CA : (2002 Feb - 2008)
Software Engineer @ Crossworlds, Burlingame, CA : (1999 July - 2002 Feb)
My open source repository at Github.com/sujee
- Founder CoverCake.com
CoverCake finds top influencers for books in social media. CoverCake also maintains a database of books featured in TV shows, radio shows, and various blogs.
- Creator Discounts For Me
iPhone app and website that lists member discounts (public radio stations, auto clubs)
[Ruby on Rails, data spidering, geocoding, REST API / JSON, iPhone SDK]
Other projects: Facebook app, a card game, Google Maps mashup, project timer
- Technical articles and HOWTOs, code contributions
Patents and Publications
B.S. in Computer Engineering, University of Melbourne, Australia, 1998