
Google Resume

Chanjun Yang, Business System Integrator


I joined Google in Jan 2010.
Significant Project Accomplishments at Google (Reverse Chronological Order)
2012
Prozac (PLX Row-level Security) - Support row-level security in PLX using GRBAC
and Rolestore.
Worked with the team on technical designs.
Developed the entire C++ GRBAC client library, a key component of the system.
Developed the C++ APIary client that talks to Rolestore, enabling PLX to become one of the early Rolestore users.
Sites | Code
Rolestore - An APIary service that maintains GRBAC policies.
Implemented features for both the service and the Rolestore client.
Designed and implemented the monitoring solution.
2011
Marketing ROI reporting - Measure the return on investment of marketing activities, focusing on AdWords acquisitions and Chrome.
Designed and implemented Java and Tenzing ETL pipelines to analyze ROI metrics for both SBM (small business marketing) and Chrome subject areas.
Provided advisory support to other developers.
Deployed and fully automated the ETL pipelines.
Continually released new features based on user requirements.
Code
Quilt - A self-service dashboard application that enables users to easily create dashboards and organize their reports.
Proposed, designed, and implemented the web application using Google technologies such as GWT, gviz, and Patchwork, and integrated it with reporting tools such as Trix, ABI, and TeaLeaves.
Led the development effort from a 20% project to a live product.
Led and reviewed another developer’s work.
Maintained the release process and production environment.
Sites | Code
Finance Data Warehouse / Highlight
Built Highlight gadget v2.
Enabled reports that would otherwise have been impossible to deliver.
HFM project
Identified affected tables for FDW and discussed design options and possible issues with the HFM team.
Production support
Continued to be the go-to person for Oracle extraction-related issues.
Production on-call.
2010
SETL - Streaming ETL library that supports frequent data refreshes in the data warehouse.
Integrated Oracle extraction process with SETL.
Worked with DBE to get streaming data from Oracle.
Finance Data Warehouse / Highlight ​- ETL / data access components relating to
reporting on financial data.
Built ETL processes and reports for P2P reporting.
Participated in Epoxy migration.
Built the Highlight gadget, a key component for delivering standard reports to FP&A.
Participated in production support.
TOFU project - Total Oracle upgrade
Performed Oracle upgrade tests for FDW and ensured a smooth
transition to the new ETL servers.
Oracle extraction for FDW - A process that extracts source data from Oracle, removes sensitive information, and stores the data in Prod.
Redesigned and implemented the Oracle extraction process.
Significant performance improvement (from 4 hours to 1 hour for
extracting ~200 tables).
More stable, with far fewer failures.
More scalable framework.
Enabled FDW developers to extract data in a configuration-based
manner.
Production deployment.
Sites | Code
Significant Non-Project Accomplishments at Google
Conducted 50+ interviews.
Mentored one Noogler.
Got Java & C++ readability.
Awards at Google
Bizapps Top5 Award for supporting row-level access control in PLX (2012).
CorpEng Award & Bizapps Top5 Award for the TOFU Oracle upgrade (2011).
Bass Award from Finance for building standard reports (2010).
6 peer bonuses (2010 - 2012).
Experience Prior to Google
Software Engineer, TechExcel Inc., Lafayette, CA, Jul 2006 - Dec 2009
Education
Iowa State University, ​Ames, IA
M.S. Computer Science, 2006
Peking University, ​Beijing, China
B.S. Computer Science, 2003
Resume-2
Oracle DBA Resume

Google Resume

Helen Polonskaya, Database Administrator, Database Engineering

Significant Project Accomplishments at Google

Jan 2012 – Present

DART Database Support


Primary DBA on DART Oracle RAC OLTP databases PROD and API.
Secondary DBA supporting DoubleClick's Oracle Financial Applications and Databases (OFPROD), Billing
Databases (IMS/CORPDW), and Remedy Databases (REMEDY).

● Primary database support for all issues, including performance, capacity, and data integrity.
● Immediate support for all production issues to maintain maximum database availability and performance.
● Fixing issues such as memory errors, undo tablespace max-out, and slow query performance.
● Investigating and providing solutions for data discrepancy issues.
● Investigating Oracle internal errors and working with Oracle support on bug fixes.
● Completed Oracle critical security patch upgrades for both PROD and API in QA and production
environments to ensure data security.

DFA6 Billing Address Management Feature. Lead DBA


● Designed, developed, and implemented all necessary database objects (tables, views, procedures, shell
scripts), allowing customers to self-manage billing information for all billing entities.
● Worked closely with the DFA6 engineering, Billing, and PM teams to ensure a smooth production rollout.

IM/ADW Database merge. Primary DBA


● Retired the IM database (all database objects were merged into the ADW database).
● Redesigned the load keyvalue process.
● Worked closely with the SA and Backend/AF eng teams to ensure:
Testing completeness
Migration transparency – no client impact
As a result, publishing time for the load keyvalue process dropped by 90%, from 10 hours to 1 hour, and overall
query performance for the AF application improved by as much as 30%.

Migration of DFA exposure activity and exposure performance tables from monthly partitions to
weekly partitions on the PDW database. Lead DBA
● Set up new loading jobs.
● Redesigned table structures to support weekly partitioning.
● Loaded 16 months' worth of data into the new tables (29,228,672,016 records).
● Set up parallel production.
● Worked with the URS engineering team to make sure reporting worked with no issues.
As a result, DB loading time improved by ~50%, query performance gained 20-90%, and storage savings totaled 50 TB.

Move DART development databases DPROD and TAPI to a new server. Primary DBA
● Installed Oracle software.
● Applied all necessary patches.
● Backed up and restored both databases.
● Successfully handed off the databases to Engineering teams on time.
As a result, the development environment became a replica of the production environment, ensuring proper
development and testing.

DFP-XFP Migration
● Analyzed DFP features for DFP-XFP migrations for all DFP clients.
● Provided the XFP migration team with feature usage data for about 50 new features to ensure successful
migration to XFP.
● Optimized the DB for slow-running extraction queries.
● Voluntarily provided DBA support for the over-the-weekend YouTube migration.

SIT/QA databases refresh: Primary DBA


● The DCLK infrastructure is not set up to easily clone large amounts of data across networks, and the overall
size of all databases had grown to about 80 TB. The DBA team worked with the SA Unix team on creative
solutions to move the data quickly from the production network to QA, restore all 80 TB into QA, and
recover the databases to a pre-set time.
● All DB changes were applied to all refreshed QA databases.
● Total downtime to switch to the new environment was a record-short 2 hours. The new environment ensures
quality production releases for all DART products and was successfully handed off to the DCLK QA team on schedule.

Mar 2008 – Dec 2011

Development of ETL Self-recovered Framework. Lead DBA


● Designed, developed, and implemented the self-recovered ETL framework.
● One purpose of this project was to make production support and maintenance for ETL processes as easy
as possible, with minimal manual interaction (extensive use of external tables, exchange partitions, and
data-driven design).

Hourly Dimension Replication Framework. Lead DBA


● Redesigned the hourly dimension replication process to support multiple replicas with a more scalable and
efficient model.

ADW ad_tag_qa_report - functionality and monitoring enhancements. Lead DBA


● Moved the hourly report used by all DFA and DFP clients to a new ETL framework.
● As a result, time to publish for this report was reduced from 7-8 hours to 5-20 minutes.

DFA Media Cost Report Optimization. Lead DBA


● Complete DB redesign of the Media cost report used by all DFA clients.
● As a result, publishing time for this report was reduced from 8-15 hours to 1 hour 30 minutes.

Prod DB review_ads_info table cleanup: Lead DBA

● Designed and implemented a new framework for the review_ads_info table cleanup process. This table is
used by all DFP and some DFA clients; whenever it reaches 6M records it must be rebuilt, otherwise
the UI starts getting timeout errors. We used to do the rebuild only during the maintenance window, but with the
new framework we were able to do it outside the maintenance window without any downtime,
and process time was reduced from 1 hour to 5 minutes.
SIT databases refresh: Primary DBA
● The DCLK infrastructure is not set up to easily clone large amounts of data across networks, and by the last
refresh the databases had grown to about 50 TB. The DBA team worked with the SA Unix team on creative
solutions to move the data quickly from the production network to QA, restore all 50 TB into QA, and
recover the databases to a pre-set time.
● All DB changes were applied to all refreshed QA databases.
● Synced up all replication and loading jobs over a 3-month period.
● Most work was done during the holiday season and on weekends to minimize impact to QA/Eng teams. Total
downtime was a record-short 2 weekdays. The new environment ensures quality production releases for all
DART products and was successfully handed off to the DCLK QA team ahead of schedule.

MTVi Reporting Redesign. Lead DBA


● Designed, developed, and implemented a multi-level pre-aggregation processing and publishing solution for
MTVi reporting.
● As a result, achieved a 40x query performance improvement. One of our biggest customers, MTV Interactive,
expressed great satisfaction.

DW RAC Migration. Oracle DBA

● Moved away from a dying 8-year-old Sun E6900 (it crashed twice within 20 hours before the migration!) to a
two-node Sun M5000 running Oracle 10g Release 2 RAC; performance has shown great improvement.
I was involved in the project from beginning to end (9 months in total).
● Set up all replication jobs.
● Set up ETL (loading) jobs.
● Worked with the URS and Backend applications on performance problems.
● Redesigned the longest DB loading jobs, which used to take 2 to 3 hours and now run in 10 to 20 minutes.
● During the migration I was the on-call DBA as well as part of the rollout DBA team; the old database
crashed at 3 a.m. on May 3rd and was down for 2 hours while the SAs recovered the box. The migration
was a great success.
● Won a SPOT award from upper management for this effort.

AOL Reporting Database Split. Lead DBA


● Separated a logically autonomous part of the common data warehouse into its own database instance
in order to remove some load from the main data warehouse and to better manage the AOL relationship.
● Copied all client-related reporting database objects (tables, views, procedures, etc.) from 2 separate
databases (DW and SPUR) into one database.
● Set up and modified all replication jobs to use the new multi-site dimensional replication framework.
● Moved all ETL jobs to the self-recover framework.
● Kept the new environment in sync during parallel production and finally switched over to it.
● Won a SPOT award from upper management for this effort.

SPUR RAC Migration. Oracle DBA


● The SPUR DB is the most important and complex data warehouse in the DCLK environment; it is the back end
for all scheduled/ad hoc reporting capabilities (except canned reporting). It has been a challenging
environment, and almost every year we had to upgrade the hardware to accommodate growth. For
better scalability, we decided to move this database instance from a single node to a new Oracle RAC
environment where we could simply add nodes if more throughput was needed. I was involved in the
project from beginning to end (18 months in total).
● Set up all replication jobs.
● Set up ETL (loading) jobs.
● Worked with the URS application on performance problems.
● Kept the new environment in sync during parallel production and finally switched over to it.

IM Report Migration to ADW: Lead DBA


● This project involved moving all data pertaining to inventory management from SPUR to the newly
created ADW RAC, both to move some resource consumption off SPUR and to prepare for when the
business turns this feature on for every client.
● Copied all client-related reporting database objects (tables, views, procedures, packages, etc.).
● Set up and modified all replication jobs to use the new multi-site dimensional replication
framework.
● Moved all ETL jobs to the self-recover framework.
● Kept the new environment in sync during parallel production and finally switched over to it.

DART PUSHER Performance Improvement: Lead DBA


● As one of the main backend processes, PUSHER is a critical piece of DART ad serving. It consists of
three levels of push (hourly, delta, and full); if any of them falls behind to a certain extent, clients cannot
see their ads served on time, or at all. We had a bottleneck on the PUSHER queries due to the
legacy design: all PUSHER clusters pulled data from a single data source on PROD, competing for
the same resource with each other and with other applications. I came up with a new design and achieved
significant performance improvements:
● Full load: from 10 hours to 4.5 hours.
● Delta (15-minute load): from 40 minutes to 10 minutes.
● Hourly: from 1 hour 15 minutes to 30 minutes.

DART API database 11g R1 upgrade. Primary DBA


● Installed Oracle 11g database software and successfully migrated the production OLTP database to 11gR1.

DART IM database 11g R2 upgrade. Primary DBA


● Completed the Oracle ASM migration successfully, 3 hours ahead of schedule; results showed a 10% to
40% performance gain.

PROD DB Cleanup. Lead DBA


● Two-year-long effort to reduce overall DB load.
● Coordinated with the DART, Search, Adserving, and Backend teams to define criteria by which to
select data to be archived.
● Developed and implemented a table-by-table archiving process.
● As a result, query performance across all applications improved significantly.

Dimensional Data Archiving for DFA/DFP: Lead DBA


● Defined archiving criteria, tested them, and developed rollout plans using parallel replication. Archived 40% of
inactive ads in the most heavily used table, DM_AD, for the two main DART reporting databases.

Prior Work Experience:


Oracle DBA, DoubleClick Inc., 2004 - 2008
DBA, GFI Group, 2003
DBA, WorldCo LLC, Financial Services, 2000 - 2003
DBA/Data Warehouse Administrator, Priceline Webhouse Club, Inc., 2000
DBA, International Telecommunication Data Systems (Amdocs), 1997 - 2000
Education:
Moscow University, M.S. in Computer Science


Resume-3

Chad P. Coughlin
7274 S Fillmore Cir Denver, CO 80122 Home: 303.698.4426 Mobile: 303.523.5785 chad.coughlin@gmail.com

Relevant Skills
Proficient in several BI tools, including Hyperion Essbase 5.2 through 9.3.1, Hyperion Planning, Essbase
Integration Server, Cognos ReportNet, Framework Manager and Impromptu Series 7, ArcPlan Insight
3.0 & Dynasight 3.0, MS Analysis Services 2000, Business Objects 5.0, Oracle Financial Analyzer 6.3
and Oracle Express 6.2, and STAR Integration Server.
Managed the full life cycle of several Hyperion implementations, including selection, design, development,
and delivery. This includes identifying end-user requirements, tool selection, designing data models to address
key business drivers, designing and developing reports, user acceptance testing, and providing training to
administrators and end-users.
Skilled at automating processes using tools and languages such as HAL, ESSCMD, MAXL, Perl, Python,
and SQL.
Hyperion Solutions 2007 presenter to over 100 conference attendees.

Experience and Qualifications


June 2006 - present, Google, Inc., Mountain View, CA
Business Systems Integrator
Project Work
Marketing ROI
Project Manager for a marketing reporting initiative to measure the return on investment of marketing
activities. This effort automates and standardizes marketing ROI reporting and is used for both strategic
and operational reporting and decision-making. Led cross-functional teams, including 3rd party agencies,
to bring together performance data as well as media spend data from multiple, disparate systems.
Managed the project timeline, gathered requirements, researched data sources, mapped data across systems,
reported status to upper management, and developed relationships with stakeholders and 3rd party agencies.
This tool has been rolled out to 100+ users, with over 2,400 queries in February (the first full month live), and
has led to several other marketing requests.
Prophet
Project/product manager for a collaborative budget/forecasting/spending management tool to
help Google managers better run their business. This internal web-based tool, built on Google
technology, allows finance and marketers to collaborate in a user-friendly environment where they can
organize and manage their spending at a level of granularity that makes sense to them.
Worked with cross-functional teams (Finance, Eng, Marketing) to gather requirements, scope the work,
mock up proposed solutions, and deliver those solutions to users. Responsible for managing the project
timeline and bugs queue, facilitating UAT, and driving consensus on prioritization.
Rolled out to 30 cost centers and 250+ users; captures over 60% of marketing spend through the March
forecast.
PO Planning
Project Manager and lead developer for an initiative to provide granular planning at the PO line level.
As part of an overall expense management initiative this involved cross-team coordination among
several groups both on the Finance and Bizapps side. The new solution allows for financial analysts in
G&A, Eng, and PM to plan and report at the PO line level, providing for more accurate forecasts as well
as providing more relevant data to their business partners to help them better run their business. Usage
metrics have shown that over 700 PO lines are being forecast across 50+ cost centers.
Bonus Accrual
IT Lead for a project to improve the accrual and forecasting accuracy of company bonuses and commissions.
Gathered requirements and created calculations to mimic the actual bonus payout as much as possible.
Complexities included handling for terms, transfers, leaves, mid-year starts and many other factors.
Reduced ~$38M variance for 2008 accrual vs. actual payout to an immaterial variance for 2009.
Sal/SJ2 Turndown
Received a Bizapps award for leading the project to move Hyperion servers out of existing data centers that
were being turned down and into other corporate data centers.
Hyperion System 9 Upgrade
Project lead for the initiative to upgrade to Hyperion System 9.3.1 from 7.x. The project included instituting a
formal failover process to an offsite DR location as well as migrating web and app services to Sun
Solaris.
Implement Channel/Project code planning and reporting
Implemented Channel/Project-level planning and reporting in Hyperion, allowing users to plan at and
report on a lower level of granularity.
Hyperion architecture improvements
Designed and implemented two major Hyperion restructuring initiatives. Used Essbase partitioning to
split the environment into smaller, purpose-built cubes, enabling significant performance improvements
while dealing with tremendous growth. Reduced dense restructures from 2+ hours to 20 minutes and reduced
forecast calc times by 70%.
Drill Through
Implemented Essbase Integration Server to provide drill-through capability at all hierarchical levels back
to journal detail and AP invoice transactions stored in the data warehouse, eliminating the cumbersome
process of users having to log in to Oracle Financials and key in parameters to get the transaction detail
behind summarized Essbase data. Usage has grown to ~150 distinct users and over 3K drills per month.
Perl to Python Migration
Assisted in the project to convert all of our jobs to run in Python. In addition, added robust monitoring,
logging, and metrics gathering to our jobs.
Hyperion team lead in the Finance IT organization supporting Hyperion Planning, Essbase, EIS, STAR
Integration Server, DRM, and Web Analysis. Manage a group that provides production support,
development, and the future design and direction of Hyperion applications.
April 2006 - May 2009, The Hackett Group, Miami, FL
Consulting Manager
Client: Google, Jul 06 - May 2009
Projects listed above.
Client: QAD​ Apr 06- Jul 06
Implemented Essbase Integration Server to provide full drill-through capability back to transactional
detail.
June 2004 - April 2006, Dex Media, Englewood, CO
OLAP Developer
Delivered several Essbase cubes to Sales, Marketing and Finance users as part of an overall
initiative to convert ERP systems and develop a new Data Warehouse. Worked with the end users to
define dimensions and develop hierarchies. Built a Star Schema model using Essbase Integration
Services (EIS) to support the Essbase cubes. Used EIS as a link between the relational structure of
the Data Warehouse and the multi-dimensional structure of Essbase, allowing users to drill
through from the summarized data presented by Essbase to the relational detail stored in the
Warehouse. Responsible for training users on the Essbase tool as well as general OLAP concepts
and functionality.
Represented the IT department on the Data Warehouse steering committee, where we met with customers to
gather requirements and share information to help them better define their reporting needs and
prioritize Data Warehouse initiatives.
December 2003-June 2004​ ​Lockheed Martin- TMA/Military Health System
Aurora, CO
Contractor- Business Intelligence
Cognos developer responsible for providing a web reporting solution as part of an overall initiative
to reengineer the Patient Encounter Processing and Reporting system from a mainframe and client
server environment to a multi-tiered, web-enabled architecture.
Performed full life cycle development of the new system, including system analysis, requirements
gathering, design, source-to-target mappings, development of the Cognos Catalog, report design and
delivery, and user acceptance testing.
January 2003-December 2003​ ​ADT Security Services​ Aurora, CO
Financial Systems Administrator
Hyperion Essbase Administrator for corporate-wide budgeting, forecasting, and reporting
applications with over 600 users. Designed and taught customized Essbase Excel Add-in class to
over 50 business users.
Redesigned close week process, using Essbase partitioning, to load and calculate current month
actuals in less than 10 minutes, a process that previously took over two hours.
September 2002- January 2003 ​Waste Management​ Houston, TX
​Contractor- Business Intelligence
Developed and designed Essbase applications that provided Finance, HR, and CRM reporting
capabilities. Provided process automation using MAXL and Perl scripts, performance tuning,
testing, and cube migration.
September 1999- January 2002 ​Application Partners​ Atlanta, GA
​Consultant- Business Intelligence
Client: PCA International​ Jul 01- Jan 02
Designed and built an Essbase solution to provide P&L and Balance Sheet reporting, leveraging data
from a GEAC mainframe extracted to a SQL Server data mart.
Wrote Esscmd scripts to automate hierarchy maintenance, data loads, and calculations to update an
8-dimensional cube nightly. Reduced the monthly reporting cycle from 5 days to 1 day.
Created web-based reporting using ArcPlan 3.0 as a front end to Essbase and SQL Server, with drill-to-
journal-detail capability.
Client: Cox Communications​ Dec 01- Jan 02
Utilized ArcPlan 3.0 to design and develop sales, financial, and balanced scorecard reporting
applications as a front end to Hyperion Essbase. Provided training and mentoring to colleagues to
take over in my absence.
Client: Deloitte & Touche LLP, Oct 99 - Jun 00 and Jan 01 - Jul 01
Designed and built Finance, Capital, Project, and Allocation budget templates with write-back
capability, using ArcPlan as a presentation and drill-back mechanism for Hyperion Essbase.
Created formulas in ArcPlan's Insight to provide features such as navigation, find functionality,
filtering, copying, allocating, seasonal budgeting patterns, printing, exporting to Excel, writing a
transaction log to MS Access, saving and retrieving from Essbase, and data integrity checks on input.
Published the application to the web through Dynasight and made it available to over 300 users
throughout the U.S.
Designed and taught a customized A3 Vision and Essbase Excel Add-in training course for over 300
end-users. Incorporated the training course content with the business rules and budgeting processes of
the client.
Client: Autodesk ​ Sep 00- Jan 01
Developed project plan to gather end user requirements, design and develop Business Objects
reports, develop user acceptance testing and provide ownership transition strategy to client within
time and budget limitations.
Client: Equifax ​ Jun 00- Sep 00
Created an eight-dimensional Essbase model as a solution for reporting financial and sales revenue
with the ability to drill to detail using Hyperion Integration Server.
June 1995 - September 1999, Storage Technology Corporation, Louisville, CO
Business Operations Specialist - Data Warehouse Group, Jul 98 - Sep 99
Technical and Business lead for the Oracle Financial Analyzer production system at headquarters as
well as its worldwide rollout to international subsidiaries. Responsible for maintaining worldwide
structures for a seven-dimensional OFA model with over 120 users.
Senior Financial Analyst Jun 95-Jul 98
Financial support for several engineering, sourcing and manufacturing groups; includes budgeting,
forecasting, and planning for groups with annual spending over $30 Million.

Education and Certifications


Hyperion Essbase Certified Professional (5x & 7x)
Master of Information Systems, August 2000
University of Colorado Denver, CO
Bachelor of Business Administration​, May 1995
​Finance and Marketing Emphasis
University of Colorado, 1993-1995 Boulder, CO
Michigan State University, 1991-1993 East Lansing, MI
