BRCODE India Reporting - Service Line

347933BR BTS DTT
348022BR BTS DTT
337932BR BTS DTT
340834BR BTS DTT
341301BR BTS DTT
342731BR BTS DTT
361739BR BTS DTT
373195BR BTS DTT
368712BR BTS DTT
380533BR BTS DTT
373228BR BTS DTT
386584BR BTS DTT
373234BR BTS DTT
374546BR BTS DTT
390392BR BTS DTT
390393BR BTS DTT
384856BR BTS DTT
396926BR BTS DTT
396927BR BTS DTT
400115BR BTS DTT
399968BR BTS DTT
400120BR BTS DTT
407329BR BTS DTT
407967BR BTS DTT
407968BR BTS DTT
407969BR BTS DTT
409514BR BTS DTT
407971BR BTS DTT
407163BR BTS DTT
406889BR BTS DTT
407438BR BTS DTT
407099BR BTS DTT
407948BR BTS DTT
407448BR BTS DTT
407975BR BTS DTT
407180BR BTS DTT
407186BR BTS DTT
407106BR BTS DTT
407578BR BTS DTT
414701BR BTS DTT
414704BR BTS DTT
447879BR BTS DTT
407654BR BTS DTT
407690BR BTS DTT
407269BR BTS DTT
407120BR BTS DTT
407111BR BTS DTT
405595BR BTS DTT
407947BR BTS DTT
405742BR BTS DTT
405745BR BTS DTT
410140BR BTS DTT
418108BR BTS DTT
424222BR BTS DTT
424265BR BTS DTT
424228BR BTS DTT
424230BR BTS DTT
424233BR BTS DTT
424234BR BTS DTT
422197BR BTS DTT
422073BR BTS DTT
422081BR BTS DTT
422042BR BTS DTT
424114BR BTS DTT
424115BR BTS DTT
422237BR BTS DTT
423782BR BTS DTT
424122BR BTS DTT
424123BR BTS DTT
424147BR BTS DTT
424124BR BTS DTT
424120BR BTS DTT
424139BR BTS DTT
424129BR BTS DTT
424415BR BTS DTT
422462BR BTS DTT
422478BR BTS DTT
423977BR BTS DTT
424410BR BTS DTT
424280BR BTS DTT
424283BR BTS DTT
424316BR BTS DTT
424321BR BTS DTT
424219BR BTS DTT
424223BR BTS DTT
424117BR BTS DTT
424330BR BTS DTT
424334BR BTS DTT
424339BR BTS DTT
424342BR BTS DTT
424345BR BTS DTT
424474BR BTS DTT
423853BR BTS DTT
423838BR BTS DTT
424328BR BTS DTT
424368BR BTS DTT
424370BR BTS DTT
425083BR BTS DTT
424442BR BTS DTT
423989BR BTS DTT
424444BR BTS DTT
424617BR BTS DTT
429617BR BTS DTT
437203BR BTS DTT
429767BR BTS DTT
462588BR BTS DTT
429768BR BTS DTT
429783BR BTS DTT
429217BR BTS DTT
429220BR BTS DTT
462594BR BTS DTT
438202BR BTS DTT
434435BR BTS DTT
434574BR BTS DTT
462598BR BTS DTT
437215BR BTS DTT
435989BR BTS DTT
435990BR BTS DTT
438306BR BTS DTT
438305BR BTS DTT
435991BR BTS DTT
435996BR BTS DTT
431228BR BTS DTT
435875BR BTS DTT
437654BR BTS DTT
433325BR BTS DTT
434600BR BTS DTT
436366BR BTS DTT
434432BR BTS DTT
436378BR BTS DTT
434570BR BTS DTT
462601BR BTS DTT
432844BR BTS DTT
436009BR BTS DTT
434539BR BTS DTT
438406BR BTS DTT
438501BR BTS DTT
438497BR BTS DTT
438237BR BTS DTT
436659BR BTS DTT
438514BR BTS DTT
438511BR BTS DTT
440568BR BTS DTT
440024BR BTS DTT
440672BR BTS DTT
440558BR BTS DTT
439994BR BTS DTT
439774BR BTS DTT
439653BR BTS DTT
440671BR BTS DTT
439789BR BTS DTT
441189BR BTS DTT
440664BR BTS DTT
440503BR BTS DTT
439610BR BTS DTT
439605BR BTS DTT
439768BR BTS DTT
439598BR BTS DTT
439652BR BTS DTT
439787BR BTS DTT
441188BR BTS DTT
439612BR BTS DTT
447099BR BTS DTT
447096BR BTS DTT
447092BR BTS DTT
447086BR BTS DTT
447848BR BTS DTT
447838BR BTS DTT
445851BR BTS DTT
446404BR BTS DTT
446403BR BTS DTT
443540BR BTS DTT
446400BR BTS DTT
442678BR BTS DTT
442704BR BTS DTT
442650BR BTS DTT
441753BR BTS DTT
445997BR BTS DTT
446398BR BTS DTT
445431BR BTS DTT
442646BR BTS DTT
442651BR BTS DTT
447180BR BTS DTT
446889BR BTS DTT
446893BR BTS DTT
448131BR BTS DTT
448133BR BTS DTT
448135BR BTS DTT
446918BR BTS DTT
446942BR BTS DTT
446731BR BTS DTT
446732BR BTS DTT
446851BR BTS DTT
446854BR BTS DTT
446857BR BTS DTT
448354BR BTS DTT
447237BR BTS DTT
446579BR BTS DTT
447520BR BTS DTT
447247BR BTS DTT
447252BR BTS DTT
447258BR BTS DTT
448137BR BTS DTT
448138BR BTS DTT
448140BR BTS DTT
448143BR BTS DTT
447270BR BTS DTT
447296BR BTS DTT
447297BR BTS DTT
447301BR BTS DTT
448271BR BTS DTT
446840BR BTS DTT
446866BR BTS DTT
446868BR BTS DTT
446871BR BTS DTT
446874BR BTS DTT
446878BR BTS DTT
446880BR BTS DTT
446884BR BTS DTT
448292BR BTS DTT
446719BR BTS DTT
446970BR BTS DTT
446984BR BTS DTT
473206BR BTS DTT
473211BR BTS DTT
473215BR BTS DTT
473216BR BTS DTT
473221BR BTS DTT
473334BR BTS DTT
473336BR BTS DTT
473342BR BTS DTT
473346BR BTS DTT
473348BR BTS DTT
445881BR BTS DTT
445919BR BTS DTT
445406BR BTS DTT
445470BR BTS DTT
445463BR BTS DTT
448376BR BTS DTT
448604BR BTS DTT
448612BR BTS DTT
445420BR BTS DTT
448593BR BTS DTT
448622BR BTS DTT
445577BR BTS DTT
445485BR BTS DTT
445563BR BTS DTT
448973BR BTS DTT
448976BR BTS DTT
448970BR BTS DTT
448977BR BTS DTT
445593BR BTS DTT
445689BR BTS DTT
445620BR BTS DTT
445699BR BTS DTT
445703BR BTS DTT
448010BR BTS DTT
445709BR BTS DTT
445712BR BTS DTT
445719BR BTS DTT
445723BR BTS DTT
445727BR BTS DTT
445732BR BTS DTT
445736BR BTS DTT
448037BR BTS DTT
448962BR BTS DTT
474681BR BTS DTT
451645BR BTS DTT
450353BR BTS DTT
450355BR BTS DTT
450351BR BTS DTT
450352BR BTS DTT
450380BR BTS DTT
450381BR BTS DTT
450383BR BTS DTT
451646BR BTS DTT
450406BR BTS DTT
473226BR BTS DTT
450349BR BTS DTT
450348BR BTS DTT
450350BR BTS DTT
450343BR BTS DTT
450344BR BTS DTT
450345BR BTS DTT
450346BR BTS DTT
451885BR BTS DTT
451887BR BTS DTT
451648BR BTS DTT
450398BR BTS DTT
450402BR BTS DTT
450405BR BTS DTT
450384BR BTS DTT
450386BR BTS DTT
450387BR BTS DTT
451780BR BTS DTT
451650BR BTS DTT
450412BR BTS DTT
450408BR BTS DTT
450420BR BTS DTT
450419BR BTS DTT
450409BR BTS DTT
450422BR BTS DTT
450424BR BTS DTT
450430BR BTS DTT
450425BR BTS DTT
450428BR BTS DTT
452427BR BTS DTT
452428BR BTS DTT
463808BR BTS DTT
471768BR BTS DTT
463833BR BTS DTT
463835BR BTS DTT
461562BR BTS DTT
463649BR BTS DTT
461576BR BTS DTT
463868BR BTS DTT
463869BR BTS DTT
463871BR BTS DTT
463872BR BTS DTT
456669BR BTS DTT
456670BR BTS DTT
473228BR BTS DTT
473231BR BTS DTT
463857BR BTS DTT
463944BR BTS DTT
463859BR BTS DTT
458108BR BTS DTT
463873BR BTS DTT
463866BR BTS DTT
463140BR BTS DTT
463147BR BTS DTT
463151BR BTS DTT
464014BR BTS DTT
463829BR BTS DTT
462086BR BTS DTT
461583BR BTS DTT
462089BR BTS DTT
462091BR BTS DTT
462096BR BTS DTT
462410BR BTS DTT
462417BR BTS DTT
462423BR BTS DTT
456672BR BTS DTT
463854BR BTS DTT
462147BR BTS DTT
462431BR BTS DTT
462156BR BTS DTT
462446BR BTS DTT
462165BR BTS DTT
463160BR BTS DTT
463165BR BTS DTT
463168BR BTS DTT
456111BR BTS DTT
456141BR BTS DTT
456246BR BTS DTT
456263BR BTS DTT
462207BR BTS DTT
462219BR BTS DTT
462225BR BTS DTT
462229BR BTS DTT
462232BR BTS DTT
462238BR BTS DTT
462254BR BTS DTT
462263BR BTS DTT
462272BR BTS DTT
462290BR BTS DTT
462301BR BTS DTT
462309BR BTS DTT
462319BR BTS DTT
463811BR BTS DTT
463477BR BTS DTT
463845BR BTS DTT
463964BR BTS DTT
463843BR BTS DTT
463850BR BTS DTT
457748BR BTS DTT
457749BR BTS DTT
463270BR BTS DTT
463976BR BTS DTT
463855BR BTS DTT
463849BR BTS DTT
463841BR BTS DTT
462306BR BTS DTT
462330BR BTS DTT
469638BR BTS DTT
469641BR BTS DTT
469629BR BTS DTT
469642BR BTS DTT
469646BR BTS DTT
469633BR BTS DTT
464688BR BTS DTT
471785BR BTS DTT
471826BR BTS DTT
471833BR BTS DTT
471840BR BTS DTT
467001BR BTS DTT
467004BR BTS DTT
467012BR BTS DTT
472588BR BTS DTT
467025BR BTS DTT
467028BR BTS DTT
467031BR BTS DTT
466993BR BTS DTT
472274BR BTS DTT
471889BR BTS DTT
471903BR BTS DTT
471910BR BTS DTT
471921BR BTS DTT
471931BR BTS DTT
471944BR BTS DTT
471955BR BTS DTT
473153BR BTS DTT
471962BR BTS DTT
473179BR BTS DTT
472037BR BTS DTT
473196BR BTS DTT
472058BR BTS DTT
472077BR BTS DTT
472096BR BTS DTT
472102BR BTS DTT
472110BR BTS DTT
472127BR BTS DTT
472134BR BTS DTT
472139BR BTS DTT
472143BR BTS DTT
472233BR BTS DTT
472280BR BTS DTT
472284BR BTS DTT
472287BR BTS DTT
472337BR BTS DTT
472288BR BTS DTT
472294BR BTS DTT
472299BR BTS DTT
472300BR BTS DTT
472302BR BTS DTT
472304BR BTS DTT
472339BR BTS DTT
472305BR BTS DTT
472306BR BTS DTT
472308BR BTS DTT
472481BR BTS DTT
472309BR BTS DTT
472311BR BTS DTT
472313BR BTS DTT
471063BR BTS DTT
473233BR BTS DTT
473236BR BTS DTT
472980BR BTS DTT
475063BR BTS DTT
472764BR BTS DTT
472784BR BTS DTT
472790BR BTS DTT
472796BR BTS DTT
472812BR BTS DTT
472820BR BTS DTT
472977BR BTS DTT
473004BR BTS DTT
473079BR BTS DTT
473084BR BTS DTT
473089BR BTS DTT
473091BR BTS DTT
473113BR BTS DTT
473120BR BTS DTT
473127BR BTS DTT
474686BR BTS DTT
474688BR BTS DTT
474703BR BTS DTT
474705BR BTS DTT
473143BR BTS DTT
473151BR BTS DTT
473157BR BTS DTT
473164BR BTS DTT
473168BR BTS DTT
473174BR BTS DTT
473181BR BTS DTT
473186BR BTS DTT
473262BR BTS DTT
473266BR BTS DTT
475097BR BTS DTT
472531BR BTS DTT
472533BR BTS DTT
472536BR BTS DTT
472539BR BTS DTT
472542BR BTS DTT
472545BR BTS DTT
472547BR BTS DTT
472549BR BTS DTT
472550BR BTS DTT
472551BR BTS DTT
472552BR BTS DTT
472555BR BTS DTT
472558BR BTS DTT
472560BR BTS DTT
473240BR BTS DTT
473244BR BTS DTT
473247BR BTS DTT
473249BR BTS DTT
473252BR BTS DTT
473257BR BTS DTT
472564BR BTS DTT
472566BR BTS DTT
472568BR BTS DTT
471990BR BTS DTT
471994BR BTS DTT
472000BR BTS DTT
472006BR BTS DTT
472010BR BTS DTT
472015BR BTS DTT
472055BR BTS DTT
472061BR BTS DTT
472085BR BTS DTT
472089BR BTS DTT
472113BR BTS DTT
472116BR BTS DTT
472123BR BTS DTT
472135BR BTS DTT
472137BR BTS DTT
472140BR BTS DTT
472141BR BTS DTT
472145BR BTS DTT
472163BR BTS DTT
470822BR BTS DTT
470828BR BTS DTT
475242BR BTS DTT
475069BR BTS DTT
473259BR BTS DTT
470701BR BTS DTT
475081BR BTS DTT
473290BR BTS DTT
473299BR BTS DTT
472971BR BTS DTT
475065BR BTS DTT
473359BR BTS DTT
473362BR BTS DTT
473366BR BTS DTT
473369BR BTS DTT
473410BR BTS DTT
473420BR BTS DTT
473428BR BTS DTT
473438BR BTS DTT
473453BR BTS DTT
473459BR BTS DTT
473463BR BTS DTT
473530BR BTS DTT
473538BR BTS DTT
473548BR BTS DTT
473556BR BTS DTT
473573BR BTS DTT
473580BR BTS DTT
473596BR BTS DTT
473607BR BTS DTT
473617BR BTS DTT
473622BR BTS DTT
473628BR BTS DTT
473637BR BTS DTT
473650BR BTS DTT
473664BR BTS DTT
473673BR BTS DTT
473686BR BTS DTT
473700BR BTS DTT
475102BR BTS DTT
472183BR BTS DTT
472193BR BTS DTT
472207BR BTS DTT
472211BR BTS DTT
471652BR BTS DTT
471708BR BTS DTT
471414BR BTS DTT
471555BR BTS DTT
471654BR BTS DTT
471856BR BTS DTT
471868BR BTS DTT
471881BR BTS DTT
471898BR BTS DTT
471923BR BTS DTT
471859BR BTS DTT
471839BR BTS DTT
471870BR BTS DTT
471885BR BTS DTT
471900BR BTS DTT
471862BR BTS DTT
471844BR BTS DTT
471873BR BTS DTT
471905BR BTS DTT
471932BR BTS DTT
471864BR BTS DTT
471713BR BTS DTT
471763BR BTS DTT
471521BR BTS DTT
471561BR BTS DTT
471662BR BTS DTT
471716BR BTS DTT
471771BR BTS DTT
471527BR BTS DTT
471479BR BTS DTT
471617BR BTS DTT
471669BR BTS DTT
471718BR BTS DTT
471774BR BTS DTT
471529BR BTS DTT
471620BR BTS DTT
471679BR BTS DTT
471779BR BTS DTT
471532BR BTS DTT
471486BR BTS DTT
471684BR BTS DTT
471723BR BTS DTT
471783BR BTS DTT
471536BR BTS DTT
471492BR BTS DTT
471627BR BTS DTT
471689BR BTS DTT
470921BR BTS DTT
475086BR BTS DTT
470944BR BTS DTT
470998BR BTS DTT
471017BR BTS DTT
471046BR BTS DTT
470704BR BTS DTT
475088BR BTS DTT
475091BR BTS DTT
473709BR BTS DTT
473721BR BTS DTT
475319BR BTS DTT
475322BR BTS DTT
471742BR BTS DTT
471744BR BTS DTT
471746BR BTS DTT
471748BR BTS DTT
472747BR BTS DTT
473002BR BTS DTT
473731BR BTS DTT
473007BR BTS DTT
472750BR BTS DTT
473642BR BTS DTT
473665BR BTS DTT
473684BR BTS DTT
471541BR BTS DTT
471497BR BTS DTT
473826BR BTS DTT
473615BR BTS DTT
473648BR BTS DTT
473672BR BTS DTT
475160BR BTS DTT
475163BR BTS DTT
471500BR BTS DTT
471634BR BTS DTT
471693BR BTS DTT
471918BR BTS DTT
470936BR BTS DTT
473014BR BTS DTT
471737BR BTS DTT
473598BR BTS DTT
473619BR BTS DTT
473364BR BTS DTT
471502BR BTS DTT
471637BR BTS DTT
475258BR BTS DTT
475260BR BTS DTT
470995BR BTS DTT
474799BR BTS DTT
474804BR BTS DTT
474810BR BTS DTT
475170BR BTS DTT
475176BR BTS DTT
474822BR BTS DTT
471650BR BTS DTT
471659BR BTS DTT
471594BR BTS DTT
471606BR BTS DTT
471677BR BTS DTT
472195BR BTS DTT
472198BR BTS DTT
472200BR BTS DTT
472203BR BTS DTT
472206BR BTS DTT
472148BR BTS DTT
472124BR BTS DTT
473371BR BTS DTT
475285BR BTS DTT
475268BR BTS DTT
474838BR BTS DTT
473745BR BTS DTT
473389BR BTS DTT
473373BR BTS DTT
472237BR BTS DTT
472222BR BTS DTT
472220BR BTS DTT
472151BR BTS DTT
472161BR BTS DTT
472236BR BTS DTT
472240BR BTS DTT
472224BR BTS DTT
475180BR BTS DTT
475183BR BTS DTT
472162BR BTS DTT
472177BR BTS DTT
472144BR BTS DTT
472129BR BTS DTT
471010BR BTS DTT
474849BR BTS DTT
475326BR BTS DTT
472149BR BTS DTT
472227BR BTS DTT
472232BR BTS DTT
472234BR BTS DTT
473377BR BTS DTT
473379BR BTS DTT
472166BR BTS DTT
472179BR BTS DTT
475270BR BTS DTT
475273BR BTS DTT
475277BR BTS DTT
471018BR BTS DTT
474857BR BTS DTT
474863BR BTS DTT
472187BR BTS DTT
471688BR BTS DTT
471543BR BTS DTT
471557BR BTS DTT
471569BR BTS DTT
471583BR BTS DTT
471655BR BTS DTT
471663BR BTS DTT
471600BR BTS DTT
471674BR BTS DTT
471610BR BTS DTT
471683BR BTS DTT
481365BR BTS DTT
488817BR BTS DTT
481366BR BTS DTT
481367BR BTS DTT
483112BR BTS DTT
488076BR BTS DTT
492007BR BTS DTT
481671BR BTS DTT
479296BR BTS DTT
476321BR BTS DTT
476326BR BTS DTT
476328BR BTS DTT
476332BR BTS DTT
476346BR BTS DTT
479964BR BTS DTT
479966BR BTS DTT
484492BR BTS DTT
484497BR BTS DTT
484499BR BTS DTT
484501BR BTS DTT
484502BR BTS DTT
479801BR BTS DTT
488517BR BTS DTT
483115BR BTS DTT
493092BR BTS DTT
477771BR BTS DTT
477774BR BTS DTT
477776BR BTS DTT
477779BR BTS DTT
477783BR BTS DTT
478500BR BTS DTT
478507BR BTS DTT
478509BR BTS DTT
478511BR BTS DTT
478514BR BTS DTT
478519BR BTS DTT
478520BR BTS DTT
478522BR BTS DTT
478525BR BTS DTT
478359BR BTS DTT
478363BR BTS DTT
478367BR BTS DTT
478369BR BTS DTT
476864BR BTS DTT
476353BR BTS DTT
476357BR BTS DTT
476374BR BTS DTT
476377BR BTS DTT
490155BR BTS DTT
479973BR BTS DTT
479975BR BTS DTT
484530BR BTS DTT
484531BR BTS DTT
484505BR BTS DTT
484506BR BTS DTT
484510BR BTS DTT
484481BR BTS DTT
484482BR BTS DTT
482690BR BTS DTT
477785BR BTS DTT
477787BR BTS DTT
477789BR BTS DTT
477792BR BTS DTT
477796BR BTS DTT
478529BR BTS DTT
478533BR BTS DTT
478632BR BTS DTT
478636BR BTS DTT
478639BR BTS DTT
478644BR BTS DTT
478648BR BTS DTT
478653BR BTS DTT
478657BR BTS DTT
478660BR BTS DTT
478379BR BTS DTT
478383BR BTS DTT
478389BR BTS DTT
478395BR BTS DTT
478253BR BTS DTT
479026BR BTS DTT
478434BR BTS DTT
478438BR BTS DTT
478444BR BTS DTT
478447BR BTS DTT
478462BR BTS DTT
478465BR BTS DTT
478470BR BTS DTT
478475BR BTS DTT
478477BR BTS DTT
478259BR BTS DTT
478278BR BTS DTT
479037BR BTS DTT
478310BR BTS DTT
478481BR BTS DTT
478484BR BTS DTT
478491BR BTS DTT
478496BR BTS DTT
478905BR BTS DTT
478906BR BTS DTT
478909BR BTS DTT
478910BR BTS DTT
478911BR BTS DTT
478913BR BTS DTT
478914BR BTS DTT
478916BR BTS DTT
478917BR BTS DTT
487942BR BTS DTT
487914BR BTS DTT
487919BR BTS DTT
487924BR BTS DTT
487926BR BTS DTT
487881BR BTS DTT
487885BR BTS DTT
487888BR BTS DTT
487891BR BTS DTT
487895BR BTS DTT
487940BR BTS DTT
487943BR BTS DTT
487860BR BTS DTT
487912BR BTS DTT
487921BR BTS DTT
487861BR BTS DTT
487865BR BTS DTT
487870BR BTS DTT
487873BR BTS DTT
487886BR BTS DTT
487951BR BTS DTT
487954BR BTS DTT
487929BR BTS DTT
487930BR BTS DTT
487932BR BTS DTT
487936BR BTS DTT
487938BR BTS DTT
487897BR BTS DTT
487901BR BTS DTT
487905BR BTS DTT
487908BR BTS DTT
487911BR BTS DTT
487945BR BTS DTT
487948BR BTS DTT
487866BR BTS DTT
487928BR BTS DTT
487931BR BTS DTT
487887BR BTS DTT
487889BR BTS DTT
487894BR BTS DTT
487900BR BTS DTT
487906BR BTS DTT
479699BR BTS DTT
395244BR BTS DTT
462451BR BTS DTT
462424BR BTS DTT
436671BR BTS DTT
437398BR BTS DTT
445853BR BTS DTT
413506BR BTS DTT
476467BR BTS DTT
445425BR BTS DTT
346387BR BTS DTT
JR: S
Application Architect-Enterprise Content Management
Data Engineer-Data Modeling
Application Architect-Data Platforms
Data Engineer-Big Data
Application Architect-Data Platforms
Data Engineer-Master Data Management
Data Engineer-Enterprise Content Management
Application Architect-Data Platforms
Data Engineer-Master Data Management
Data Engineer-Big Data
Application Architect-Data Platforms
Data Consultant-Data Governance
Application Architect-Data Platforms
Application Architect-Data Platforms
Application Architect-Data Platforms
Application Architect-Data Platforms
Data Engineer-Enterprise Content Management
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Data Modeling
Application Architect-Data Platforms
Data Engineer-Data Modeling
Data Consultant-Data Governance
Data Engineer-Master Data Management
Data Engineer-Master Data Management
Data Engineer-Master Data Management
Data Engineer-Data Integration
Data Engineer-Master Data Management
Data Engineer-Data Modeling
Data Engineer-Business Intelligence
Data Engineer-Big Data
Application Architect-Data Platforms
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Master Data Management
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Application Architect-Data Platforms
Data Engineer-Big Data
Data Consultant-Data Governance
Data Consultant-Data Governance
Data Engineer-Data Modeling
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Data Integration
Application Architect-Data Platforms
Application Architect-Data Platforms
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Business Intelligence
Data Engineer-Big Data
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Business Intelligence
Data Consultant-Data Governance
Data Consultant-Data Governance
Data Engineer-Enterprise Content Management
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Business Intelligence
Data Engineer-Big Data
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Warehouse
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Enterprise Content Management
Data Engineer-Business Intelligence
Data Engineer-Big Data
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Warehouse
Data Engineer-Data Warehouse
Application Architect-Data Platforms
Application Architect-Data Platforms
Data Engineer-Big Data
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Warehouse
Data Engineer-Big Data
Data Engineer-Business Intelligence
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Warehouse
Data Engineer-Data Integration
Data Engineer-Big Data
Data Engineer-Data Integration
Data Engineer-Business Intelligence
Data Engineer-Data Integration
Application Architect-Data Platforms
Data Engineer-Big Data
Project Manager-Data Platforms
Data Engineer-Big Data
Data Engineer-Data Integration
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Project Manager-Data Platforms
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Data Integration
Project Manager-Data Platforms
Data Engineer-Enterprise Content Management
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Enterprise Content Management
Data Engineer-Big Data
Application Architect-Data Platforms
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Business Intelligence
Data Engineer-Big Data
Data Engineer-Business Intelligence
Data Engineer-Data Integration
Project Manager-Data Platforms
Data Engineer-Enterprise Content Management
Data Engineer-Data Integration
Data Engineer-Big Data
Data Engineer-Master Data Management
Application Architect-Data Platforms
Application Architect-Data Platforms
Data Engineer-Big Data
Data Engineer-Business Intelligence
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Integration
Data Engineer-Data Integration
Application Architect-Data Platforms
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Enterprise Content Management
Data Engineer-Big Data
Application Architect-Data Platforms
Data Engineer-Enterprise Content Management
Data Engineer-Enterprise Content Management
Application Architect-Data Platforms
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Enterprise Content Management
Data Engineer-Data Integration
Data Engineer-Big Data
Data Engineer-Enterprise Content Management
Data Engineer-Enterprise Content Management
Data Engineer-Business Intelligence
Application Architect-Data Platforms
Application Architect-Data Platforms
Application Architect-Data Platforms
Application Architect-Data Platforms
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Big Data
Data Engineer-Enterprise Content Management
Data Engineer-Enterprise Content Management
Data Engineer-Data Integration
Data Engineer-Enterprise Content Management
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Integration
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Enterprise Content Management
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Consultant-Data Governance
Data Consultant-Data Governance
Data Consultant-Data Governance
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Enterprise Content Management
Data Engineer-Enterprise Content Management
Data Engineer-Enterprise Content Management
Data Engineer-Enterprise Content Management
Data Engineer-Enterprise Content Management
Data Engineer-Business Intelligence
Data Engineer-Big Data
Data Engineer-Data Integration
Data Engineer-Data Modeling
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Consultant-Data Governance
Data Consultant-Data Governance
Data Consultant-Data Governance
Data Consultant-Data Governance
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Data Integration
Data Engineer-Enterprise Content Management
Data Engineer-Enterprise Content Management
Data Engineer-Enterprise Content Management
Data Engineer-Enterprise Content Management
Data Engineer-Enterprise Content Management
Data Engineer-Enterprise Content Management
Data Engineer-Enterprise Content Management
Data Engineer-Big Data
Data Engineer-Master Data Management
Data Engineer-Data Integration
Data Engineer-Data Integration
Application Architect-Data Platforms
Application Architect-Data Platforms
Application Architect-Data Platforms
Application Architect-Data Platforms
Application Architect-Data Platforms
Application Architect-Data Platforms
Application Architect-Data Platforms
Application Architect-Data Platforms
Application Architect-Data Platforms
Application Architect-Data Platforms
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Integration
Data Consultant-Data Governance
Data Engineer-Business Intelligence
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Warehouse
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Warehouse
Data Consultant-Data Governance
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Data Modeling
Data Engineer-Enterprise Content Management
Application Architect-Data Platforms
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Master Data Management
Data Engineer-Master Data Management
Data Engineer-Data Modeling
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Data Integration
Data Engineer-Data Modeling
Data Engineer-Enterprise Content Management
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Enterprise Content Management
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Master Data Management
Data Engineer-Master Data Management
Data Engineer-Data Integration
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Data Warehouse
Data Engineer-Big Data
Data Engineer-Data Warehouse
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Application Architect-Data Platforms
Application Architect-Data Platforms
Data Engineer-Business Intelligence
Data Engineer-Business Intelligence
Data Engineer-Big Data
Application Architect-Data Platforms
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Enterprise Content Management
Data Engineer-Enterprise Content Management
Data Engineer-Enterprise Content Management
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Big Data
Data Engineer-Data Warehouse
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Business Intelligence
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Enterprise Content Management
Data Engineer-Enterprise Content Management
Data Engineer-Enterprise Content Management
Data Engineer-Business Intelligence
Data Engineer-Enterprise Content Management
Data Engineer-Business Intelligence
Data Engineer-Enterprise Content Management
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Data Integration
Data Engineer-Data Modeling
Data Engineer-Business Intelligence
Data Engineer-Business Intelligence
Data Engineer-Business Intelligence
Data Engineer-Business Intelligence
Application Architect-Data Platforms
Application Architect-Data Platforms
Data Engineer-Data Modeling
Data Engineer-Business Intelligence
Data Engineer-Business Intelligence
Data Engineer-Business Intelligence
Data Engineer-Business Intelligence
Data Engineer-Data Integration
Data Engineer-Big Data
Data Engineer-Enterprise Content Management
Data Engineer-Enterprise Content Management
Data Engineer-Enterprise Content Management
Data Engineer-Enterprise Content Management
Data Engineer-Enterprise Content Management
Data Engineer-Enterprise Content Management
Data Engineer-Big Data
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Data Integration
Application Architect-Data Platforms
Application Architect-Data Platforms
Data Engineer-Data Integration
Data Engineer-Enterprise Content Management
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Business Intelligence
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Application Architect-Data Platforms
Application Architect-Data Platforms
Application Architect-Data Platforms
Application Architect-Data Platforms
Application Architect-Data Platforms
Application Architect-Data Platforms
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Data Integration
Data Engineer-Data Integration
Project Manager-Data Platforms
Data Engineer-Business Intelligence
Application Architect-Data Platforms
Application Architect-Data Platforms
Data Engineer-Business Intelligence
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Integration
Data Engineer-Enterprise Content Management
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Data Engineer-Business Intelligence
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Data Integration
Data Engineer-Business Intelligence
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Application Architect-Data Platforms
Data Engineer-Business Intelligence
Data Engineer-Business Intelligence
Data Engineer-Data Modeling
Data Engineer-Data Modeling
Project Manager-Data Platforms
Project Manager-Data Platforms
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Modeling
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Enterprise Content Management
Data Engineer-Enterprise Content Management
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Data Integration
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Master Data Management
Data Engineer-Master Data Management
Data Engineer-Data Integration
Data Engineer-Data Warehouse
Data Engineer-Data Warehouse
Data Engineer-Data Warehouse
Data Engineer-Enterprise Content Management
Data Engineer-Enterprise Content Management
Data Engineer-Data Warehouse
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Data Integration
Data Engineer-Business Intelligence
Data Engineer-Master Data Management
Data Engineer-Data Warehouse
Data Engineer-Data Modeling
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Enterprise Content Management
Data Engineer-Enterprise Content Management
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Data Integration
Data Engineer-Data Warehouse
Project Manager-Data Platforms
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Master Data Management
Data Engineer-Master Data Management
Data Engineer-Master Data Management
Data Engineer-Data Integration
Data Engineer-Data Warehouse
Data Engineer-Data Warehouse
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Data Integration
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Business Intelligence
Data Engineer-Business Intelligence
Data Engineer-Business Intelligence
Data Engineer-Business Intelligence
Data Engineer-Business Intelligence
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Enterprise Content Management
Data Engineer-Data Integration
Data Engineer-Business Intelligence
Application Architect-Data Platforms
Application Architect-Data Platforms
Application Architect-Data Platforms
Application Architect-Data Platforms
Application Architect-Data Platforms
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Business Intelligence
Data Engineer-Business Intelligence
Data Engineer-Business Intelligence
Data Engineer-Business Intelligence
Data Engineer-Business Intelligence
Data Engineer-Business Intelligence
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Data Integration
Application Architect-Data Platforms
Application Architect-Data Platforms
Application Architect-Data Platforms
Application Architect-Data Platforms
Application Architect-Data Platforms
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Data Integration
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Data Integration
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Data Integration
Data Engineer-Data Integration
Application Architect-Data Platforms
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Data Integration
Data Engineer-Data Integration
Application Architect-Data Platforms
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Data Integration
Data Engineer-Business Intelligence
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Big Data
Data Engineer-Business Intelligence
Data Engineer-Data Integration
Data Engineer-Data Modeling
Job Description Years of experience Primary Location
tions which manages non-ERP document management requirements. Document Management overall architecture. Document Migration design and migrat 6 - 8 years Hyderabad
Modeling in DataWarehouse Environment Demonstrable experience using data modeling tools - e.g., ErWin Evaluate existing data models and physic 2 - 4 years Bangalore
8 - 10 years Bangalore
ing with end users. A detailed understanding and experience of development best practices in below areas: Continuous Integration, branching and merg 8 - 10 years Hyderabad

· Proven working experience in MDM-RDM space (Master data and Reference data) · AWS Certified, EKS, Aviatrix, Artefactory, HDP, Terraform 8 - 10 years Bangalore
· Must be open to learn New technologies 4 - 6 years Any
Documentum P2 2 - 4 years Kolkata
Python, AWS, Mongo DB, Document DB, Big Data 8 - 10 years Bangalore
Stibo STEP MDM. Knowledge on AWS cloud will be appreciated. Resource should have good communication, work experience in AMS support Model. 4 - 6 years Bangalore
ing with end users. A detailed understanding and experience of development best practices in below areas: Continuous Integration, branching and merg 8 - 10 years Hyderabad
e + Snowflake + Apache Kafka + Azure DataBricks + HDInsights / OpenShift / Azure Kubernetes Services 8 - 10 years Bangalore
/InfoSphere IGC / Denodo / Talend Data Fabric / IBM Data Virtualisation / Watson Knowledge Catalog / IDA, Erwin 6 - 8 years Bangalore
Open Source - Cloudera, Hortonworks 8 - 10 years Bangalore
IT Data SME 4 - 6 years Bangalore
ess Architecture Experience with object-oriented/object function scripting languages: Python, SQL, Scala, Spark-SQL etc. Experience with Devops. Stro 6 - 8 years Bangalore
n implementing ETL using Pyspark on Azure Databricks. Must have strong knowledge of Software Development Life Cycle including requirement ana 6 - 8 years Bangalore
ty, Folder Structure, TeamSpace, Choice lists, Entry Templates and Search Templates etc., Hands-on experience in ICN Plugins and External Data Serv 2 - 4 years Mumbai
5. Snowflake 6. Python 4 - 6 years Bangalore
5. Snowflake 6. Python 4 - 6 years Bangalore
ectional leads on projects during requirements development conversations and guide based on best technical solution available. ticket raising? They said time is a critical factor, that we need to find a suitable resource offshore asap. The candidate must have good communication sk 12+ years Bangalore
ing new technologies/capabilities and advise strategically about how technology and tool capabilities can be leveraged to create Analytic and Business 6 - 8 years Pune
e ticket raising? They said time is a critical factor, that we need to find a suitable resource offshore asap. The candidate must have good communication sk 12+ years Bangalore
ory (Infosphere Governance Catalog) Information Management Strategy Design Co-ordinate with Multiple US based Business Stakeholders Exposure to r · Proven working experience in MDM-RDM space (Master data and Reference data) 2 - 4 years Pune

· Proven working experience in MDM-RDM space (Master data and Reference data) · Must be open to learn New technologies 4 - 6 years Any

· Proven working experience in MDM-RDM space (Master data and Reference data) · Must be open to learn New technologies 6 - 8 years Any
· Must be open to learn New technologies 4 - 6 years Any
egration plus Application Integration), ODI, Informatica Powercenter, Informatica Powercenter exchange, Informatica Data Quality. Database · Proven working experience in MDM-RDM space (Master data and Reference data) 2 - 4 years Bangalore
· Must be open to learn New technologies 6 - 8 years Any
ol Conduct logical data analysis and data modeling joint application design (JAD) sessions, documented data-related standards Define data modeling and 8 - 10 years Kolkata
Chain data visualization Should have exposure in R/Python. Good communication - Effectively communicate & interact with internal and external clients 8 - 10 years Kolkata
pplication landscape based on data analysis and requirements, code walk through and map to future state architecture on GCP is required. Data engineering, Azure, Hadoop, NoSQL, Spark, SQL Server 6 - 8 years Bangalore
nce leading a team, set up projects in GCP and define process for team to implement future state architecture. 8 - 10 years Bangalore
ases. Qualification and Skills Experience of APIs, microservices, Kubernetes and data ingestion (Airflow) and ETL patterns Experience of developing d 8 - 10 years Bangalore
ases. Qualification and Skills Experience of APIs, microservices, Kubernetes and data ingestion (Airflow) and ETL patterns Experience of developing d 6 - 8 years Bangalore
e client master data. Skills include client information architecture, MDM tools including WebSphere Customer Center (DWL), CIIS, Siperion, etc. Experie 4 - 6 years Bangalore
ol Conduct logical data analysis and data modeling joint application design (JAD) sessions, documented data-related standards Define data modeling and 8 - 10 years Kolkata

source. Advanced Knowledge on architecture Relational Database Systems and Big Data design framework is required to translate the source data model and reverse d managing an organization's data in the cloud (Big Data, analytics, information analysis, data lakes and database management solutions) 2 - 4 years Pune
Experience of architecting of data ingestion ETL solutions in the cloud (Airflow) 6 - 8 years Bangalore
ases. Qualification and Skills Experience of APIs, microservices, Kubernetes and data ingestion (Airflow) and ETL patterns Experience of developing d 6 - 8 years Bangalore
API technologies Experience in developing data management capabilities or data warehousing is an advantage Experience in developing masterdata capabilities, data standardization, and quality development Good team leadership and interaction s 6 - 8 years Bangalore
Experience in regulatory requirements and regulation in the fi 8 - 10 years Bangalore
tic) Experience in building scalable end-to-end data ingestion and processing solutions 2. Writing PL/SQL jobs to run load from staging to dimensional model 3. Excellent communication skills 4. Ability to work independently as well as 6 - 8 years Pune
erience with object-oriented and/or functional programming languages, such as Python, Java and Scala" Experience in building scalable end-to-end data ingestion and processing solutions 8 - 10 years Chennai
erience with object-oriented and/or functional programming languages, such as Python, Java and Scala" 8 - 10 years Chennai
Mandatory): Banking Domain Experience, Scheduling tool like Autosys, Control-M, IBM Tivoli experience. Hands on experience on any ETL tool. Incas 4 - 6 years Pune
era based Data lake with strong Microsoft understanding and future goal of Cloudera on Azure. BigData Skills and understanding: Hadoop, Scala, SQL 8 - 10 years Bangalore
era based Data lake with strong Microsoft understanding and future goal of Cloudera on Azure. BigData Skills and understanding: Hadoop, Scala, SQL 8 - 10 years Bangalore
Development experience in Hadoop and Spark component knowledge 2 - 4 years Bangalore
Big Data technologies including API development. Expected to have traditional Application Development background along with knowledge of Analytics 6 - 8 years Hyderabad
era based Data lake with strong Microsoft understanding and future goal of Cloudera on Azure. BigData Skills and understanding: Hadoop, Scala, SQL 6 - 8 years Bangalore
era based Data lake with strong Microsoft understanding and future goal of Cloudera on Azure. BigData Skills and understanding: Hadoop, Scala, SQL 6 - 8 years Bangalore
BI-Azure 8 - 10 years Kolkata
0 4 - 6 years Mumbai
gration with data sources (Mainframe files, RDBMS, SFTP, Kafka) Experience with AWS services like S3, DataSync, RDS. (Good to have) Informatic ETL Informatica developer must have the following - 3 to 5 years experience. 6 - 8 years Bangalore
Nice to have -- Experience on SAP Data Conversion in the areas SAP FICO, SAP MM, SAP SD 2 - 4 years Bangalore
6 - 8 years Bangalore
6 - 8 years Bangalore
6 - 8 years Bangalore
6 - 8 years Bangalore
n OLTP and OLAP models Experience working on Tableau 9.x (Desktop, Server, Reader), creating various Reports and Dashboards using different funct 4 - 6 years Any
adata solution) Metadata management development (platform, glossaries, data dictionary, data catalog, Lineage, data models) Data architecture or soluti 8 - 10 years Bangalore
ed to information management Experience in developing masterdata capabilities, data standardization and quality development is beneficial Experience i 8 - 10 years Bangalore
OD / Mainframe Developer with 4+ years of relevant experience in CMOD. Good Communication skills. 4 - 6 years Pune
ands-on Bitbucket/Git experience, Experience in software development, Apache Spark experience is a bonus, Big Data experience is a bonus, Pythonic code 8 - 10 years Hyderabad
ands-on Bitbucket/Git experience, Experience in software development, Apache Spark experience is a bonus, Big Data experience is a bonus, Pythonic code 8 - 10 years Hyderabad
on Big Data technologies using PySpark. Strong technical abilities to understand, design, write and debug complex code 6. COBOL knowledge is added advantage. 7. Good communication and writing skills 4 - 6 years Pune
Mandate skills - Pyspark, Kubernetes (desirable) 6 - 8 years Bangalore
gration with data sources (Mainframe files, RDBMS, SFTP, Kafka) Experience with AWS services like S3, DataSync, RDS. (Good to have) Informatic 2 - 4 years Bangalore
gration with data sources (Mainframe files, RDBMS, SFTP, Kafka) Experience with AWS services like S3, DataSync, RDS. (Good to have) Informatic 2 - 4 years Bangalore
rnance, data quality, data preparation, or data architecture Experience in Informatica (Informatica Enterprise Data Catalog) Prior experience in all stages 4 - 6 years Kolkata
Offshore Informatica Cloud & AWS 2 - 4 years Kolkata
Databricks / Snowflake / Cassandra/ Teradata 4 - 6 years Bangalore
gration with data sources (Mainframe files, RDBMS, SFTP, Kafka) Experience with AWS services like S3, DataSync, RDS. (Good to have) Informatic ETL Informatica developer must have the following - 3 to 5 years experience. 6 - 8 years Bangalore
Nice to have -- Experience on SAP Data Conversion in the areas SAP FICO, SAP MM, SAP SD 2 - 4 years Bangalore
Have a very good grasp on SQL to be able to extract and analyze data Has a good understanding of Entity relationships and can understand the associated business process. 2 - 4 years Pune
OD / Mainframe Developer with 4+ years of relevant experience in CMOD. Good Communication skills. 4 - 6 years Pune
gn kit. Includes integration between Business Intelligence Platforms, which will allow users to toggle easily between reporting and analysis tasks. The re on Big Data technologies using PySpark. Strong technical abilities to understand, design, write and debug complex code 6 - 8 years Hyderabad
Mandate skills - Pyspark, Kubernetes (desirable) 4 - 6 years Hyderabad
um 3 to 4 years of experience with Enterprise ETL Platforms and processes 3+ years of experience with SQL development, Stored Procedures 2 years of 6 - 8 years Kolkata
ess. Resolve bugs within defined service levels. Manage and own application Bugs and incident life cycle, and drive those to permanent resolutions, imple 2 - 4 years Gurgaon
Looking for an Ab Initio resource who can build data pipelines & handle any new data sourcing needs 6 - 8 years NCR
ETL Tools / Python/Kafka / Spark 6 - 8 years Bangalore
ETL Tools / Python/Kafka / Spark 6 - 8 years Bangalore
Experience in Data Migration from RDBMS to Snowflake cloud data warehouse. 12+ years Hyderabad
ing with end users. A detailed understanding and experience of development best practices in below areas: Continuous Integration, branching and merg 12+ years Hyderabad
ing with end users. A detailed understanding and experience of development best practices in below areas: Continuous Integration, branching and merg 8 - 10 years Hyderabad
Tableau 6 - 8 years Hyderabad
Hands-on with any scripting skills is preferred(Python/Spark) 6 - 8 years Hyderabad
Experience in ETL development & deployment using IBM DataStage 6 - 8 years Hyderabad
Knowledge on AWS 6 - 8 years Hyderabad
Knowledge on dbt (data build tool)""" 6 - 8 years Hyderabad
rt, Fast Load, MultiLoad, TPump and TPT), Oracle 8.1 with ETL knowledge Good to have (Not Mandatory): Informatica, SQL, UNIX/Linux Shell Scripti 6 - 8 years Hyderabad
FS using Hive and Spark. Experience with Object Oriented Programming using Python and its design patterns. Experience handling Unix systems, for op 4 - 6 years Chennai
erience with Cognos Framework Manager and Cognos Transformer is needed. Should have hands on experience on SQL queries and worked on debuggin 4 - 6 years Hyderabad
Tableau 4 - 6 years Hyderabad

Tableau Data science workflow exposure 6 - 8 years Hyderabad
Cloud pipelines (AWS or Google Cloud) 6 - 8 years Hyderabad
rt, Fast Load, MultiLoad, TPump and TPT), Oracle 8.1 with ETL knowledge Good to have (Not Mandatory): Informatica, SQL, UNIX/Linux Shell Scripti 6 - 8 years Hyderabad
Must have: Strong hands-on Ab Initio experience, along with Unix and Sql 4 - 6 years Bangalore
FS using Hive and Spark. Experience with Object Oriented Programming using Python and its design patterns. Experience handling Unix systems, for op 4 - 6 years Chennai
ted solutions based on Informatica product. Experience extracting data from a variety of sources, and a desire to expand those skills (working knowledge 4 - 6 years Chennai
erience with Cognos Framework Manager and Cognos Transformer is needed. Should have hands on experience on SQL queries and worked on debuggin 4 - 6 years Hyderabad
4 - 6 years Bangalore
d hands-on expertise in creating data solutions for client, creating best of breed end-to-end solutions leveraging Cloud and traditional data platform offerin 8 - 10 years Bangalore
evelopment background along with knowledge of Analytics libraries, open-source Natural Language Processing, statistical and big data computing librarie 2 - 4 years Bangalore
ements, including IBM and Client employees and 3rd party vendors. Requires experience with PM methodologies. Skill include Big Data, Analytics, Busi 6 - 8 years Bangalore
evelopment background along with knowledge of Analytics libraries, open-source Natural Language Processing, statistical and big data computing librarie 2 - 4 years Bangalore
ge (formerly Ascential) - IBM's WebSphere Data Integration Suite. Skills include designing and developing extract, transform and load (ETL) processes. E 2 - 4 years Bangalore
d in databases. Implementation of one conceptual data model may require multiple logical data models. The last step in data modeling is transforming the 4 - 6 years Bangalore
d in databases. Implementation of one conceptual data model may require multiple logical data models. The last step in data modeling is transforming the 6 - 8 years Bangalore
ements, including IBM and Client employees and 3rd party vendors. Requires experience with PM methodologies. Skill include Big Data, Analytics, Busi 6 - 8 years Bangalore
park and DBT, Data Vault/Data Warehouse experience, Strong knowledge of data in general, Strong understanding of Java/Python, 8 - 10 years Bangalore
park and DBT, Data Vault/Data Warehouse experience, Strong knowledge of data in general, Strong understanding of Java/Python, 6 - 8 years Bangalore

7. Promote innovation in team and able to generate innovative approach to any given problem 4 - 6 years Bangalore
nderstanding of retail domain is preferred, ability to discuss with client business people in their language 4 - 6 years Bangalore
ation, Kofax KIC folder import, email, fax, web service, KAFC - Kofax Analytics for Capture, Kofax Export Script Customization and ), Kofax KAPOW/ 2 - 4 years Bangalore
WhereEscape, SnowFlake, AWS 4 - 6 years Bangalore
4 - 6 years Bangalore
8 - 10 years Bangalore
8 - 10 years Bangalore
4 - 6 years Bangalore
4 - 6 years Bangalore
Filenet Developer 2 - 4 years Hyderabad
Big Data technologies including API development. Expected to have traditional Application Development background along with knowledge of Analytics 6 - 8 years Hyderabad
performance tuning Strong experience in one of programming languages (Scala, Java) Familiarity with development tools (experience on either IntelliJ / E 8 - 10 years Hyderabad
ment of ETL development work--Design, develop, test, implement and troubleshoot ETL mappings in a large Data Warehouse environment using Power Center. Experience in Data Warehouse applications, Oracle PL/SQL and UNIX shell scripts--Build shell scripts, Oracle packages
code) skills through Informatica’s Enterprise Data Catalog (EDC) or Microsoft Purview or Collibra for curating the Data and Analytics Metadata and define a proces 6 - 8 years Any
e process for all ongoing and future developments to ensure that the Business and Technical Metadata definitions remains current on the Enterprise Data C 4 - 6 years Pune
Power BI, Tableau, Cognos, MS Excel (Expert) SQL, Spark Teradata and Oracle PL/SQL
2 - 4 years Bangalore
park and DBT, Data Vault/Data Warehouse experience, Strong knowledge of data in general, Strong understanding of Java/Python, 6 - 8 years Bangalore
technical specification documents from business requirements Problem-solving skills o Proficiency in designing and developing Dashboards & Scorec 4 - 6 years Bangalore
forms. The role requires Oracle database knowledge and cloud platform knowledge to be able to troubleshoot the performance issues & bottlenecks, take n 7. Promote innovation in team and able to generate innovative approach to any given problem 4 - 6 years Bangalore
nderstanding of retail domain is preferred, ability to discuss with client business people in their language 4 - 6 years Bangalore
ation, Kofax KIC folder import, email, fax, web service, KAFC - Kofax Analytics for Capture, Kofax Export Script Customization and ), Kofax KAPOW/ 2 - 4 years Bangalore
WhereEscape, SnowFlake, AWS 4 - 6 years Bangalore

· Proven working experience in MDM-RDM space (Master data and Reference data) · In depth knowledge in Azure Databricks and data analysis. 2 - 4 years Bangalore
· Must be open to learn New technologies 2 - 4 years Any
GCP - Hadoop / Solution Architect / Tech SME 12+ years Any

pplication landscape based on requirements and code walk through and map to future state architecture on GCP is required. GCP - Hadoop / Solution Architect / Tech SME 12+ years Any
nce leading a team, set up projects in GCP and define process for team to implement future state architecture. 8 - 10 years Bangalore
hniques, and customizing out of the box widgets. Hands on expertise on Qlik scripting, data source integration, report's performance optimization, and adv 1 - 3 years Pune
ears of Experience in Designing and Data Modeling of Data Analytics Systems using Relational and Non Relational databases Mongo DB 4 - 6 years Bangalore
ears of Experience in Designing and Data Modeling of Data Analytics Systems using Relational and Non Relational databases Mongo DB 4 - 6 years Bangalore
om on-prem to cloud. Need people who have very good understanding of data warehouses, ETL Informatica, and SQL server databases. They need to know 4 - 6 years Any
be hands-on, know components of Informatica, IICS is a must have, and also guide the team members with required technical expertise. Testing the Inform 6 - 8 years Bangalore
Cloudera Data Platform installation, upgrade, configuration, optimization, and administration 12+ years Bangalore
Powercentre 9.5 version, worked on 10 Version is add on. Worked on One automic Scheduler /Control m scheduler is Add on. Ensure that the applicati 2 - 4 years Gurgaon
centre 9.5 version, worked on 10 Version is add on. Worked on One automic Scheduler /Control m scheduler is Add on. Ensure that the application is run 4 - 6 years Gurgaon
HP Extreme skills 6 - 8 years Pune
of Big Data (HDP and HDF) Kafka and NiFi/MiNiFi Hands-on coding in Python for Apache Spark (PySpark and Spark Streaming) Hive, Hbase and Pho 6 - 8 years Any
t Design end to end Data Architectures taking both specific business needs and our enterprise blueprints in to account to make sure we have efficient reu 8 - 10 years Bangalore
 Automatic Document Numbering (ADN)
 Advance State Management (ASM) 6 - 8 years Ahmedabad
Experience with OpenText Vendor Invoice Management (VIM) 8 - 10 years Mumbai
Cloudera Data Platform installation, upgrade, configuration, optimization, and administration 12+ years Bangalore
Powercentre 9.5 version, worked on 10 Version is add on. Worked on One automic Scheduler /Control m scheduler is Add on. Ensure that the applicati 2 - 4 years Gurgaon
centre 9.5 version, worked on 10 Version is add on. Worked on One automic Scheduler /Control m scheduler is Add on. Ensure that the application is run 4 - 6 years Gurgaon
nd DWH SQL Basic Java Code versioning (Git/BitBucket) Jenkins Any scheduling tool, Automic will be perfect (but not mandatory) 4 - 6 years Any
HP Extreme skills 6 - 8 years Pune
pting, Linux 3. Data Analysis and if possible masking experience 4. Good in SQL 5. RDMS concepts, Oracle, DB2 on Z, MSSQL etc 6. Data related expe 4 - 6 years Pune
of Big Data (HDP and HDF) Kafka and NiFi/MiNiFi Hands-on coding in Python for Apache Spark (PySpark and Spark Streaming) Hive, Hbase and Pho 8 - 10 years Hyderabad
 Automatic Document Numbering (ADN)
 Advance State Management (ASM) 6 - 8 years Ahmedabad
Experience with OpenText Vendor Invoice Management (VIM) 8 - 10 years Mumbai
Chain data visualization Should have exposure in R/Python. Good communication - Effectively communicate & interact with internal and external clients c: Leadership quality 4 - 6 years Bangalore
Educational Qualification : 15 years of full time education c: Leadership quality 6 - 8 years Gurgaon
Educational Qualification : 15 years of full time education c: Leadership quality 4 - 6 years Gurgaon
Educational Qualification : 15 years of full time education c: Leadership quality 4 - 6 years Gurgaon
Educational Qualification : 15 years of full time education  Good knowledge in building process to build data lineage, Metadata, data catalogue 6 - 8 years Gurgaon
Ability to clearly communicate complex technical ideas, regardless of the technical capacity of the audience 6 - 8 years Bangalore
 Good knowledge in building process to build data lineage, Metadata, data catalogue Ability to clearly communicate complex technical ideas, regardless of the technical capacity of the audience 6 - 8 years Bangalore
Azure Data Factory, Python 1 - 3 years Any
owledge of the existing Cordys BOP, associated technologies and standards is highly desirable. Experience in Entity Modeling & Business Process Mod - Hands-on experience working with databases like Oracle, SQL etc. 1 - 3 years Bangalore
- Knowledge of Extream xECM is a plus 1 - 3 years Bangalore
ge in areas Strong interpersonal and Communication skills SAS development experience with SAS Base, SAS DI, SAS Enterprise Guide and must have SAS ETL development skills along with Python p 2 - 4 years Hyderabad
Ability to work well in a team oriented environment. 4 - 6 years Hyderabad
yrs in Data Engineer-Data Modeling - UML, database modelling, Star schema, SQL. Good implementation skills and communication skills. 4 - 6 years Bangalore
yrs in Data Engineer-Data Modeling - UML, database modelling, Star schema, SQL. Good implementation skills and communication skills. 4 - 6 years Bangalore

pplications, tuning, performance troubleshooting, maintains security, and automates routine procedures through scripting; OLTP & OLAP DWH using Talend (mandatory) and Pentaho (preferable) Suite 6 - 8 years Gurgaon
Strong analytical, problem solving, communication and presentation skills. 4 - 6 years Bangalore
ation/reporting tools like Tableau. Strong interpersonal and Communication skills; Role & Responsibility (What exactly this resource will be doing): The resource will be providing production support to 4 - 6 years Bangalore
Ability to work well in a team oriented environment. 4 - 6 years Hyderabad
sources Experience with Data Marts, Data Warehouse structures (e.g., star schema, fact and dimensions). 3+ years Database Experience including Oracle 2 - 4 years Kolkata
and Shell scripting. - Strong on DW Fundamentals, concepts, understands and review technical requirements, designs ETL process flow and walkthroughs 6 - 8 years Hyderabad
OLTP & OLAP DWH using Talend (mandatory) and Pentaho (preferable) Suite6 - 8 years Gurgaon
Scala Spark, Apache Griffin with Core Java 4 - 6 years Bangalore
Scala Spark, Apache Griffin with Core Java; certifications added advantage: Splunk Certified Admin, Splunk Certified Architect, Splunk Certified Consultant 4 - 6 years Bangalore
•Identify innovative ways to improve the process of delivering solutions to clients.
6 - 8 years Hyderabad
s and development, testing Data archive exp from various data sources like SQL, Oracle, JDEdwards, Microsoft GP Should have exp on Native applicat 6 - 8 years Bangalore
s and development, testing Data archive exp from various data sources like SQL, Oracle, JDEdwards, Microsoft GP Should have exp on Native applicat 6 - 8 years Bangalore
s and development, testing Data archive exp from various data sources like SQL, Oracle, JDEdwards, Microsoft GP Should have exp on Native applicat 6 - 8 years Bangalore
infrastructure required for optimal extraction, transformation, and loading data from a wide variety of data sources using SQL and GCP Big Data technolo 4 - 6 years Bangalore
Big Data technologies including API development. Expected to have traditional Application Development background along with knowledge of Analytics 4 - 6 years Hyderabad
M Case Manager and Content Navigator, basic knowledge of Java/Javascript. Hands-on experience with administrator tools such as ACCE, knowledge on 6 - 8 years Hyderabad
ontent Navigator Configuration and Customization. In addition, it is required to have hands-on experience with administrator tools such as ACCE, knowl 6 - 8 years Hyderabad
/ Case Manager and some Java/React knowledge is good to have. Good Communication skill and also worked in Agile environment. 4 - 6 years Bangalore
istrative tools such as ACCE, ICM Admin, ICN Admin. Working knowledge of deployment processes of FileNet artefacts and code promotion. Good at p 4 - 6 years Hyderabad
. Must be aware of good practices related to Java development, code maintenance. Basic knowledge on deployment process. Knowledge on DEVOPS and 4 - 6 years Hyderabad
m multiple sources such as Teradata and Hive into Qlikview applications. Create performance efficient data models and dashboards. Exposure in JavaScri; pplication landscape based on requirements and code walk through and map to future state architecture on GCP is required. 4 - 6 years Hyderabad
nce leading a team, set up projects in GCP and define process for team to implement future state architecture. 8 - 10 years Bangalore
n Ab Initio development - GDE, Co-op > 3.4.x, Plan, Metaprogramming, MHUB (Knowledge of other Ab Initio products would be an added advantage) 1 - 3 years Pune
the Data Model is up to date at all times reflecting the changes which have been made into the source systems. The resource would be required to synchr 6 - 8 years Delhi
Scala Spark, Apache Griffin with Core Java 4 - 6 years Bangalore
Scala Spark, Apache Griffin with Core Java; certifications added advantage: Splunk Certified Admin, Splunk Certified Architect, Splunk Certified Consultant 4 - 6 years Bangalore
•Identify innovative ways to improve the process of delivering solutions to clients.
6 - 8 years Hyderabad
s and development, testing Data archive exp from various data sources like SQL, Oracle, JDEdwards, Microsoft GP Should have exp on Native applicat 4 - 6 years Bangalore
s and development, testing Data archive exp from various data sources like SQL, Oracle, JDEdwards, Microsoft GP Should have exp on Native applicat 6 - 8 years Bangalore
s and development, testing Data archive exp from various data sources like SQL, Oracle, JDEdwards, Microsoft GP Should have exp on Native applicat 6 - 8 years Bangalore
s and development, testing Data archive exp from various data sources like SQL, Oracle, JDEdwards, Microsoft GP Should have exp on Native applicat 6 - 8 years Bangalore
. The effort will be more on Pyspark development. Min. 2+ years of Big Data experience. Primary Skill: Spark, Hadoop, Hive (Pyspark expertise is prefer 2 - 4 years Hyderabad
infrastructure required for optimal extraction, transformation, and loading data from a wide variety of data sources using SQL and GCP Big Data technolo 4 - 6 years Bangalore
infrastructure required for optimal extraction, transformation, and loading data from a wide variety of data sources using SQL and GCP Big Data technolo 4 - 6 years Bangalore
infrastructure required for optimal extraction, transformation, and loading data from a wide variety of data sources using SQL and GCP Big Data technolo 4 - 6 years Bangalore
Big Data technologies including API development. Expected to have traditional Application Development background along with knowledge of Analytics 4 - 6 years Hyderabad
ing and developing extract, transform and load (ETL) processes. Experience includes full lifecycle implementation of the technical components of a busi 4 - 6 years Pune
pporting Filenet projects, Key skills which includes Java / J2EE programming using FileNet P8 CE, PE, WAT APIs and running Database SQL queries P 6 - 8 years Bangalore
M Case Manager and Content Navigator, basic knowledge of Java/Javascript. Hands-on experience with administrator tools such as ACCE, knowledge on 6 - 8 years Hyderabad
ontent Navigator Configuration and Customization. In addition, it is required to have hands-on experience with administrator tools such as ACCE, knowl 6 - 8 years Hyderabad
net and C#. Experience with IBM FileNet P8. Experience with paper document to DataCap, Indexing, Prepping, Verification, Rules Validation and Pr 4 - 6 years Hyderabad
/ Case Manager and some Java/React knowledge is good to have. Good Communication skill and also worked in Agile environment. 4 - 6 years Bangalore
istrative tools such as ACCE, ICM Admin, ICN Admin. Working knowledge of deployment processes of FileNet artefacts and code promotion. Good at p 4 - 6 years Hyderabad
. Must be aware of good practices related to Java development, code maintenance. Basic knowledge on deployment process. Knowledge on DEVOPS and pplication landscape based on requirements and code walk through and map to future state architecture on GCP is required. 4 - 6 years Hyderabad
nce leading a team, set up projects in GCP and define process for team to implement future state architecture; ap analysis between current and to-be states; identifies gaps and defines and documents requirements to close gaps. 8 - 10 years Bangalore
6) Able to lead workshops, define best practices and must possess good communication skills 4 - 6 years Bangalore
n Ab Initio development - GDE, Co-op > 3.4.x, Plan, Metaprogramming, MHUB (Knowledge of other Ab Initio products would be an added advantage) 1 - 3 years Pune
n Ab Initio development - GDE, Co-op > 3.4.x, Plan, Metaprogramming, MHUB (Knowledge of other Ab Initio products would be an added advantage) 1 - 3 years Pune
nd Spring Cloud). Expertise in Distributed computing design patterns and algorithms and data structures and security protocols. Experience in core Spark 8 - 10 years Bangalore
nd Spring Cloud). Expertise in Distributed computing design patterns and algorithms and data structures and security protocols. Experience in core Spark 8 - 10 years Bangalore
nd Spring Cloud). Expertise in Distributed computing design patterns and algorithms and data structures and security protocols. Experience in core Spark 8 - 10 years Bangalore
nd Spring Cloud). Expertise in Distributed computing design patterns and algorithms and data structures and security protocols. Experience in core Spark 8 - 10 years Bangalore
nd Spring Cloud). Expertise in Distributed computing design patterns and algorithms and data structures and security protocols. Experience in core Spark 8 - 10 years Bangalore
nd Spring Cloud). Expertise in Distributed computing design patterns and algorithms and data structures and security protocols. Experience in core Spark 12+ years Bangalore
nd Spring Cloud). Expertise in Distributed computing design patterns and algorithms and data structures and security protocols. Experience in core Spark 12+ years Bangalore
nd Spring Cloud). Expertise in Distributed computing design patterns and algorithms and data structures and security protocols. Experience in core Spark 12+ years Bangalore
nd Spring Cloud). Expertise in Distributed computing design patterns and algorithms and data structures and security protocols. Experience in core Spark 12+ years Bangalore
nd Spring Cloud). Expertise in Distributed computing design patterns and algorithms and data structures and security protocols. Experience in core Spark 12+ years Bangalore
Good technical experience building data integration processes by constructing mappings, tasks, taskflows, schedules, and parameter files. Good understan 1 - 3 years Bangalore
d, ETL Design and development techniques, Create ETL jobs from source-target mapping documents, Extensive ETL and SQL database skills 1 - 3 years Bangalore
6 years in Ab Initio development experience with UNIX background. RDBMS concepts are a must. Having Mainframe knowledge is an additional skill 4 - 6 years Bangalore
data model and trace back issues related to data to source systems - understand interfaces across different systems and ACM. Person should have worked 6 - 8 years Bangalore

Python, Hive, Spark/Scala, Sqoop, Hadoop; Systems and Big data design framework is required to translate the source data model and reverse engineer the database into the IDA tool. 1 - 3 years Pune
Systems and Big data design framework is required to translate the source data model and reverse engineer the database into the Data model / IDA tool.
and socialized with Client Architects and Pros/Cons provided before implementing the mapping to Target System IIW industry model. 2 - 4 years Pune
the data strategies and build data flows between source and target with the IT teams (API team to Snowflake team); and socialized with Client Architects and Pros/Cons provided before implementing the mapping to Target System IIW industry Data model. 4 - 6 years Pune
6. Work closely with Busines 2 - 4 years Pune
Primary Skills: IBM DataStage PL/SQL Unix 4 - 6 years Pune
spects of data collection, organization, and integration. Experience with design, configuration, and rollout of Data Governance platforms such as Collibr 4 - 6 years Any
tegration between Business Intelligence Platforms, which will allow users to toggle easily between reporting and analysis tasks. The reports can be distrib 6 - 8 years Hyderabad
data model and trace back issues related to data to source systems - understand interfaces across different systems and ACM. Person should have worked 6 - 8 years Bangalore
Python, Hive, Spark/Scala, Sqoop, Hadoop 1 - 3 years Pune

Informatica Power Exchange, SQL, UNIX; Systems and Big data design framework is required to translate the source data model and reverse engineer the database into the IDA tool. 1 - 3 years Pune
Systems and Big data design framework is required to translate the source data model and reverse engineer the database into the Data model / IDA tool.
and socialized with Client Architects and Pros/Cons provided before implementing the mapping to Target System IIW industry model. 4 - 6 years Pune
and socialized with Client Architects and Pros/Cons provided before implementing the mapping to Target System IIW industry model. 2 - 4 years Pune
Systems and Big data design framework is required to translate the source data model and reverse engineer the database into the Data model / IDA tool.
the data strategies and build data flows between source and target with the IT teams (API team to Snowflake team); and socialized with Client Architects and Pros/Cons provided before implementing the mapping to Target System IIW industry Data model. 4 - 6 years Pune
6. Work closely with Busines 2 - 4 years Pune
Primary Skills: IBM DataStage, PL/SQL, Unix 2 - 4 years Pune
Primary Skills: IBM DataStage, PL/SQL, Unix 4 - 6 years Pune
Primary Skills: IBM DataStage, PL/SQL, Unix 2 - 4 years Pune
Primary Skills: IBM DataStage, PL/SQL, Unix 4 - 6 years Pune
Primary Skills: IBM DataStage, PL/SQL, Unix 4 - 6 years Pune
Knowledge on AWS; Knowledge on dbt (data build tool) 4 - 6 years Pune
Primary Skills: IBM DataStage PL/SQL Unix 4 - 6 years Pune
Primary Skills: IBM DataStage PL/SQL Unix 4 - 6 years Pune
Primary Skills: IBM DataStage PL/SQL Unix 4 - 6 years Pune
Primary Skills: IBM DataStage PL/SQL Unix 4 - 6 years Pune
Primary Skills: IBM DataStage PL/SQL Unix 4 - 6 years Pune
Primary Skills: IBM DataStage PL/SQL Unix 4 - 6 years Pune
Primary Skills: IBM DataStage PL/SQL Unix 4 - 6 years Pune
queries. Experience in Data Migration from RDBMS to Snowflake cloud data warehouse. Deep understanding of relational as well as NoSQL data stor 6 - 8 years Pune
spects of data collection, organization, and integration. Experience with design, configuration, and rollout of Data Governance platforms such as Collibr 4 - 6 years Any
apabilities needed to support business capabilities of a given business domain. Provides consulting and guidance regarding the usage of the enterprise integ 8 - 10 years Bangalore
apabilities needed to support business capabilities of a given business domain. Provides consulting and guidance regarding the usage of the enterprise integ 6 - 8 years Bangalore
AEP, Experience in Cloud based ETL tool and Python 6 - 8 years Bangalore
AEP, Experience in Cloud based ETL tool and Python for cloud migration projects. Python coding along with scala and experience working on GCP platform Data services is required. Preferred on premises to 6 - 8 years Bangalore
nd code walk through and map to future state architecture on GCP is required. Experience leading a team technically and review for quality of work produ 4 - 6 years Bangalore
AEP, Experience in Cloud based ETL tool and Python 4 - 6 years Bangalore
AEP, Experience in Cloud based ETL tool and Python 2 - 4 years Bangalore
AEP, Experience in Cloud based ETL tool and Python 2 - 4 years Bangalore
In depth knowledge in Azure big data development. 2 - 4 years Bangalore
oficient in Mapping Document / Data Modeling, AVRO Schema. Strong Data Modeling and Data Integration skills, Solid Experience with Microservice. Additional Skills: 4 - 6 years Bangalore
Understands value and committed to enable value delivery through data; ator (highly desirable to have) 3. IBM Content Collector 4. Active Directory and general security knowledge 5. WebSphere and Linux technologies. 4 - 6 years Bangalore
Intrapersonal skills and ability to empower and mentor team members 8 - 10 years Bangalore
park and DBT, Data Vault/Data Warehouse experience, Strong knowledge of data in general, Strong understanding of Java/Python 4 - 6 years Bangalore
hitecture for data platforms. 2. Execute Data Profiling and technical analysis of the source data to identify the gaps, issues and data quality problems exist 6 - 8 years Bangalore
infrastructure required for optimal extraction, transformation, and loading data from a wide variety of data sources using SQL and GCP Big Data technolo 4 - 6 years Bangalore
infrastructure required for optimal extraction, transformation, and loading data from a wide variety of data sources using SQL and GCP Big Data technolo 2 - 4 years Bangalore
infrastructure required for optimal extraction, transformation, and loading data from a wide variety of data sources using SQL and GCP Big Data technolo 2 - 4 years Bangalore
infrastructure required for optimal extraction, transformation, and loading data from a wide variety of data sources using SQL and GCP Big Data technolo 2 - 4 years Bangalore
infrastructure required for optimal extraction, transformation, and loading data from a wide variety of data sources using SQL and GCP Big Data technolo; ap analysis between current and to-be states; identifies gaps and defines and documents requirements to close gaps. 2 - 4 years Bangalore
ap analysis between current and to-be states; identifies gaps and defines and documents requirements to close gaps. 6) Able to lead workshops, define best practices and must possess good communication skills 6 - 8 years Bangalore
6) Able to lead workshops, define best practices and must possess good communication skills 4 - 6 years Bangalore
apabilities needed to support business capabilities of a given business domain. Provides consulting and guidance regarding the usage of the enterprise integ 6 - 8 years Bangalore
AEP, Experience in Cloud based ETL tool and Python for cloud migration projects. Python coding along with scala and experience working on GCP platform Data services is required. Preferred on premises to 6 - 8 years Bangalore
nd code walk through and map to future state architecture on GCP is required. Experience leading a team technically and review for quality of work produ 4 - 6 years Bangalore
AEP, Experience in Cloud based ETL tool and Python 4 - 6 years Bangalore
AEP, Experience in Cloud based ETL tool and Python 2 - 4 years Bangalore
AEP, Experience in Cloud based ETL tool and Python 2 - 4 years Bangalore
In depth knowledge in Azure big data development. 2 - 4 years Bangalore
gration with data sources (Mainframe files, RDBMS, SFTP, Kafka). Experience with AWS services like S3, DataSync, RDS. (Good to have) Informatic 4 - 6 years Bangalore
oficient in Mapping Document / Data Modeling, AVRO Schema. Strong Data Modeling and Data Integration skills, Solid Experience with Microservice. Additional Skills: 4 - 6 years Bangalore
ator (highly desirable to have) 3. IBM Content Collector 4. Active Directory and general security knowledge 5. WebSphere and Linux technologies. 4 - 6 years Bangalore
park and DBT, Data Vault/Data Warehouse experience, Strong knowledge of data in general, Strong understanding of Java/Python 4 - 6 years Bangalore
hitecture for data platforms. 2. Execute Data Profiling and technical analysis of the source data to identify the gaps, issues and data quality problems exist 6 - 8 years Bangalore
OpenText Developer (OTCS, xECM, OTAS) 8 - 10 years Exp 6 - 8 years Pune
infrastructure required for optimal extraction, transformation, and loading data from a wide variety of data sources using SQL and GCP Big Data technolo 4 - 6 years Bangalore
infrastructure required for optimal extraction, transformation, and loading data from a wide variety of data sources using SQL and GCP Big Data technolo 2 - 4 years Bangalore
infrastructure required for optimal extraction, transformation, and loading data from a wide variety of data sources using SQL and GCP Big Data technolo 2 - 4 years Bangalore
infrastructure required for optimal extraction, transformation, and loading data from a wide variety of data sources using SQL and GCP Big Data technolo 6 - 8 years Bangalore
infrastructure required for optimal extraction, transformation, and loading data from a wide variety of data sources using SQL and GCP Big Data technolo 2 - 4 years Bangalore
infrastructure required for optimal extraction, transformation, and loading data from a wide variety of data sources using SQL and GCP Big Data technolo; ap analysis between current and to-be states; identifies gaps and defines and documents requirements to close gaps. 2 - 4 years Bangalore
ap analysis between current and to-be states; identifies gaps and defines and documents requirements to close gaps. 6) Able to lead workshops, define best practices and must possess good communication skills 6 - 8 years Bangalore
6) Able to lead workshops, define best practices and must possess good communication skills 4 - 6 years Bangalore
Knowledge in Azure; Knowledge in any of the object oriented programming language like Java, Python etc 4 - 6 years Pune
AWS S3 Data Engineers (Glue) 8 - 10 years Pune
AWS S3 Data Engineers (Glue) 6 - 8 years Pune
Streamsets Data Engineers 6 - 8 years Pune
DBT and Snowflake Data Engineers 6 - 8 years Pune
AWS S3 Data Engineers (Glue) 4 - 6 years Pune
DBT and Snowflake Data Engineers 4 - 6 years Pune
AWS S3 Data Engineers (Glue) 2 - 4 years Pune
Streamsets Data Engineers 2 - 4 years Pune
Streamsets Data Engineers 2 - 4 years Pune
Streamsets Data Engineers 2 - 4 years Pune
Big Data technologies including API development. Expected to have traditional Application Development background along with knowledge of Analytics 6 - 8 years Hyderabad
Big Data technologies including API development. Expected to have traditional Application Development background along with knowledge of Analytics 6 - 8 years Hyderabad
warehouses with a good understanding of cluster and parallel architecture as well as high-scale or distributed RDBMS and/or knowledge on NoSQL platfo 12+ years Bangalore
Core Skillset: Data Solutions Arch Secondary Skillset: Integration, Spark, Java, Cloud
8 - 10 years Bangalore
th building dashboard, this will be an individual contributor’s role who will be driving changes by themselves and helping us build dashboards. Strong in b 4 - 6 years Pune
m work and individual delivery. Experience with DevOps tools (git/BitBucket). Cognos data manager. ETL experience. Good communication skills. Knowle 2 - 4 years Bangalore
Code versioning (GIT/SVN) - Data ingestion from APIs - Problem solving skills - Quick learning of new technologies - Experience in A 4 - 6 years Bangalore
m of AWS developers, engage with client on change requirement and ensure quality deliverables within target deadlines. - Has a minimum of 4 years' in ro 8 - 10 years Bangalore
2 - 4 years Bangalore
n of Rules in template Strong Working knowledge in Java, J2EE, XML, HTML, Oscript, Weblingo, AJAX, Webservices and Design Patterns 4 - 6 years Bangalore
OT Exstream - Developer 4 - 6 years Any
reusable objects residing on multiple platforms. Content Management addresses steps for developing a content strategy for an application with content as 4 - 6 years Kolkata
ntent Management addresses steps for developing a content strategy for an application with content assets that need to be managed and organized: identify 4 - 6 years Kolkata
e Ab-Initio, Unix, Oracle Database and Ab-Initio development experience will be an advantage. Location: Hyderabad/Bangalore/Gurgaon. 2 - 4 years Gurgaon
e Ab-Initio, Unix, Oracle Database and Ab-Initio development experience will be an advantage. Location: Hyderabad/Bangalore/Gurgaon. 2 - 4 years Gurgaon
Streamsets Data Engineers 6 - 8 years Pune
DBT and Snowflake Data Engineers 6 - 8 years Pune
AWS S3 Data Engineers (Glue) 4 - 6 years Pune
AWS S3 Data Engineers (Glue) 4 - 6 years Pune
Streamsets Data Engineers 4 - 6 years Pune
AWS S3 Data Engineers (Glue) 2 - 4 years Pune
Streamsets Data Engineers 2 - 4 years Pune
Streamsets Data Engineers 2 - 4 years Pune
Big Data technologies including API development. Expected to have traditional Application Development background along with knowledge of Analytics 6 - 8 years Hyderabad
th building dashboard, this will be an individual contributor’s role who will be driving changes by themselves and helping us build dashboards. Strong in b 4 - 6 years Pune
Code versioning (GIT/SVN) - Data ingestion from APIs - Problem solving skills - Quick learning of new technologies - Experience in A 4 - 6 years Bangalore
2 - 4 years Bangalore
4 - 6 years Bangalore
2 - 4 years Bangalore
n of Rules in template Strong Working knowledge in Java, J2EE, XML, HTML, Oscript, Weblingo, AJAX, Webservices and Design Patterns 4 - 6 years Bangalore
OT Exstream - Developer 4 - 6 years Any
reusable objects residing on multiple platforms. Content Management addresses steps for developing a content strategy for an application with content as 4 - 6 years Kolkata
ntent Management addresses steps for developing a content strategy for an application with content assets that need to be managed and organized: identify; Understanding of Agile principles and processes will be a plus. 4 - 6 years Kolkata
self starter and result oriented person with an eye for detail and ability to provide inputs/ideas for improvement; 5. Oversee the teams work, and ensure roadblocks are removed 6 - 8 years Gurgaon
Understanding of Agile principles and processes will be a plus. 6. Valid Passport. 6 - 8 years Gurgaon
self starter and result oriented person with an eye for detail and ability to provide inputs/ideas for improvement; 5. Oversee the teams work, and ensure roadblocks are removed 6 - 8 years Gurgaon
6. Valid Passport. 6 - 8 years Gurgaon
uding API development. Expected to have traditional Application Development background along with knowledge of Analytics libraries, open-source Na 6 - 8 years Bangalore
ing with end users. A detailed understanding and experience of development best practices in below areas: Continuous Integration, branching and merg 6 - 8 years Hyderabad
ing with end users. A detailed understanding and experience of development best practices in below areas: Continuous Integration, branching and merg 6 - 8 years Hyderabad
ing with end users. A detailed understanding and experience of development best practices in below areas: Continuous Integration, branching and merg 6 - 8 years Hyderabad
ing with end users. A detailed understanding and experience of development best practices in below areas: Continuous Integration, branching and merg 6 - 8 years Hyderabad
uding API development. Expected to have traditional Application Development background along with knowledge of Analytics libraries, open-source Na 4 - 6 years Hyderabad
uding API development. Expected to have traditional Application Development background along with knowledge of Analytics libraries, open-source Na 6 - 8 years Bangalore
ing with end users. A detailed understanding and experience of development best practices in below areas: Continuous Integration, branching and merg 6 - 8 years Hyderabad
ing with end users. A detailed understanding and experience of development best practices in below areas: Continuous Integration, branching and merg 6 - 8 years Hyderabad
ing with end users. A detailed understanding and experience of development best practices in below areas: Continuous Integration, branching and merg 6 - 8 years Hyderabad
ing with end users. A detailed understanding and experience of development best practices in below areas: Continuous Integration, branching and merg 6 - 8 years Hyderabad
uding API development. Expected to have traditional Application Development background along with knowledge of Analytics libraries, open-source Na 4 - 6 years Hyderabad
Big Data technologies including API development. Expected to have traditional Application Development background along with knowledge of Analytics 6 - 8 years Hyderabad
5-7 Yrs Mandatory Skills: ETL Informatica. Good to have (Not Mandatory): Detailed Job Description - Experience in ETL Informatica. 6 - 8 years Mumbai
6 - 8 years Bangalore
th building dashboard, this will be an individual contributor’s role who will be driving changes by themselves and helping us build dashboards. Strong in b; Prepare all documents for reporting objects 4 - 6 years Pune
Monitor and resolve all desk ticket issues and troubleshoot; Prepare all documents for reporting objects 6 - 8 years Any
Monitor and resolve all desk ticket issues and troubleshoot; Prepare all documents for reporting objects 6 - 8 years Any
Monitor and resolve all desk ticket issues and troubleshoot 6 - 8 years Any
12+ years Any
12+ years Any
perience of which 5+ years in Data Integration - Design, Analysis, Modelling, Mapping. Minimum 2+ yrs on Architect role.
Good to have - Knowledge on Retail Domain - Supply chain, JDE, Finance, HR 6 - 8 years Bangalore
th building dashboard, this will be an individual contributor’s role who will be driving changes by themselves and helping us build dashboards. Strong in b; Prepare all documents for reporting objects 4 - 6 years Pune
Monitor and resolve all desk ticket issues and troubleshoot; Prepare all documents for reporting objects 6 - 8 years Any
Monitor and resolve all desk ticket issues and troubleshoot; Prepare all documents for reporting objects 6 - 8 years Any
Monitor and resolve all desk ticket issues and troubleshoot 6 - 8 years Any
ecture and integration. Modeling, metadata management, extract-transform-load (ETL), data staging techniques, to integrate with multiple end systems to 6 - 8 years Any

5. Oversee the teams work, and ensure roadblocks are removed 6. Valid Passport. 4 - 6 years Bangalore
5. Oversee the teams work, and ensure roadblocks are removed 6. Valid Passport. 4 - 6 years Gurgaon
5. Oversee the teams work, and ensure roadblocks are removed 6. Valid Passport. 4 - 6 years Gurgaon
5. Oversee the teams work, and ensure roadblocks are removed 6. Valid Passport. 6 - 8 years Gurgaon
5. Oversee the teams work, and ensure roadblocks are removed 6. Valid Passport. 4 - 6 years Gurgaon
5. Oversee the teams work, and ensure roadblocks are removed 6. Valid Passport. 4 - 6 years Gurgaon
6 - 8 years Gurgaon
6 - 8 years Hyderabad
n Ab Initio development - GDE, Co-op > 3.4.x, Plan, Metaprogramming, MHUB (Knowledge of other Ab Initio products would be an added advantage) 2 - 4 years Pune
n Ab Initio development - GDE, Co-op > 3.4.x, Plan, Metaprogramming, MHUB (Knowledge of other Ab Initio products would be an added advantage) 2 - 4 years Pune
n Ab Initio development - GDE, Co-op > 3.4.x, Plan, Metaprogramming, MHUB (Knowledge of other Ab Initio products would be an added advantage) 2 - 4 years Pune
n Ab Initio development - GDE, Co-op > 3.4.x, Plan, Metaprogramming, MHUB (Knowledge of other Ab Initio products would be an added advantage); Experience in building scalable end-to-end data ingestion and processing solutions 2 - 4 years Pune
Experience in building scalable end-to-end data ingestion and processing solutions; erience with object-oriented and/or functional programming languages, such as Python, Java and Scala 4 - 6 years Pune
Experience in building scalable end-to-end data ingestion and processing solutions; erience with object-oriented and/or functional programming languages, such as Python, Java and Scala 4 - 6 years Pune
Experience in building scalable end-to-end data ingestion and processing solutions; erience with object-oriented and/or functional programming languages, such as Python, Java and Scala 4 - 6 years Pune
Experience in building scalable end-to-end data ingestion and processing solutions; erience with object-oriented and/or functional programming languages, such as Python, Java and Scala 2 - 4 years Pune
Experience in building scalable end-to-end data ingestion and processing solutions; erience with object-oriented and/or functional programming languages, such as Python, Java and Scala 4 - 6 years Pune
Experience in building scalable end-to-end data ingestion and processing solutions; erience with object-oriented and/or functional programming languages, such as Python, Java and Scala 4 - 6 years Pune
Experience in building scalable end-to-end data ingestion and processing solutions; erience with object-oriented and/or functional programming languages, such as Python, Java and Scala 4 - 6 years Pune
Experience in building scalable end-to-end data ingestion and processing solutions; erience with object-oriented and/or functional programming languages, such as Python, Java and Scala 4 - 6 years Pune
erience with object-oriented and/or functional programming languages, such as Python, Java and Scala 4 - 6 years Pune
Ab Initio development - GDE, Co>Op 3.4.x, Plan, Metaprogramming, MHUB (Knowledge of other Ab Initio products would be an added advantage) | 2 - 4 years | Pune (×7)
Ab Initio development - GDE, Co>Op 3.4.x, Plan, Metaprogramming, MHUB (Knowledge of other Ab Initio products would be an added advantage) | 6 - 8 years | Pune
Ab Initio development - GDE, Co>Op 3.4.x, Plan, Metaprogramming, MHUB (Knowledge of other Ab Initio products would be an added advantage) | 2 - 4 years | Pune
Ab Initio development - GDE, Co>Op 3.4.x, Plan, Metaprogramming, MHUB (Knowledge of other Ab Initio products would be an added advantage) | 6 - 8 years | Pune
Ab Initio development - GDE, Co>Op 3.4.x, Plan, Metaprogramming, MHUB (Knowledge of other Ab Initio products would be an added advantage) | 2 - 4 years | Pune
Ab Initio development - GDE, Co>Op 3.4.x, Plan, Metaprogramming, MHUB (Knowledge of other Ab Initio products would be an added advantage) | 6 - 8 years | Pune
Ab Initio development - GDE, Co>Op 3.4.x, Plan, Metaprogramming, MHUB (Knowledge of other Ab Initio products would be an added advantage) | 2 - 4 years | Pune (×9)
Ab Initio development - GDE, Co>Op 3.4.x, Plan, Metaprogramming, MHUB (Knowledge of other Ab Initio products would be an added advantage) | 2 - 4 years | Pune
Experience in building scalable end-to-end data ingestion and processing solutions; experience with object-oriented and/or functional programming languages, such as Python, Java and Scala | 4 - 6 years | Pune (×3)
Experience in building scalable end-to-end data ingestion and processing solutions; experience with object-oriented and/or functional programming languages, such as Python, Java and Scala | 6 - 8 years | Pune
Experience in building scalable end-to-end data ingestion and processing solutions; experience with object-oriented and/or functional programming languages, such as Python, Java and Scala | 4 - 6 years | Pune (×6)
Experience in building scalable end-to-end data ingestion and processing solutions; experience with object-oriented and/or functional programming languages, such as Python, Java and Scala | 6 - 8 years | Pune
Experience in building scalable end-to-end data ingestion and processing solutions; experience with object-oriented and/or functional programming languages, such as Python, Java and Scala | 4 - 6 years | Pune (×3)
Experience in building scalable end-to-end data ingestion and processing solutions; experience with object-oriented and/or functional programming languages, such as Python, Java and Scala | 6 - 8 years | Pune
Experience in building scalable end-to-end data ingestion and processing solutions; experience with object-oriented and/or functional programming languages, such as Python, Java and Scala | 4 - 6 years | Pune (×3)
…Experience on SAS DI will be an added advantage - Strong Data Analysis and Problem Solving Skills | 2 - 4 years | Pune
…guide the reporting team as well (Review the reporting solution) - Understand Data-warehousing (Data Vault Modeling is a plus) - Capable to guide the te… | 8 - 10 years | Bangalore
…architect to ensure appropriate requirements are incorporated. Responsible for ensuring Data Quality Management standards are adhered to in new p… | 8 - 10 years | Pune
…analysis and ETL development activities. Worked with Big Data projects across Hadoop and Cloud technologies (or equivalents): S3, Hive, Spark, Pre… | 6 - 8 years | Gurgaon
…Content Navigator Configuration and Customization. In addition, it is required to have hands-on experience with administrator tools such as ACCE, knowl… | 6 - 8 years | Hyderabad
…data models according to business requirements for specific use cases and/or client's business domains; …modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling, Entity-Relationship (ER) Models… | 4 - 6 years | Hyderabad (×7)
…data models according to business requirements for specific use cases and/or client's business domains; …modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling, Entity-Relationship (ER) Models… | 6 - 8 years | Hyderabad (×7)
…data models according to business requirements for specific use cases and/or client's business domains; …modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling, Entity-Relationship (ER) Models… | 6 - 8 years | Hyderabad
…consulting and guidance regarding the usage of the enterprise integrated logical model for use in development. 7. Have deep knowledge of data architectures, ODSs, Data warehouses and methodologies. | 6 - 8 years | Pune
…data models according to business requirements for specific use cases and/or client's business domains; …modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling, Entity-Relationship (ER) Models… | 8 - 10 years | Hyderabad (×3)
…data models according to business requirements for specific use cases and/or client's business domains; …modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling, Entity-Relationship (ER) Models… | 4 - 6 years | Hyderabad (×6)
…data models according to business requirements for specific use cases and/or client's business domains; …modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling, Entity-Relationship (ER) Models… | 6 - 8 years | Hyderabad (×4)
…design kit. Includes integration between Business Intelligence Platforms, which will allow users to toggle easily between reporting and analysis tasks. The re… | 4 - 6 years | NOIDA
…Spark, Impala, Hive, as well as database technology; Source Code Control (experience with Git preferred); able to perform Unix / Linux scripting | 6 - 8 years | Hyderabad (×14)
…technologies - ex. Spark, Hive, Hadoop, Presto, Kafka etc.; understanding or hands-on experience of data visualization challenges, tools used (Qlikview…) | 8 - 10 years | Hyderabad (×6)
…Spark, Impala, Hive, as well as database technology; Source Code Control (experience with Git preferred); able to perform Unix / Linux scripting | 6 - 8 years | Hyderabad (×3)
…Spark, Impala, Hive, as well as database technology; Source Code Control (experience with Git preferred); able to perform Unix / Linux scripting | 4 - 6 years | Hyderabad (×19)
DataStage, QualityStage, IA, Data Migration | 8 - 10 years | Gurgaon
…analysis and ETL development activities. Worked with Big Data projects across Hadoop and Cloud technologies (or equivalents): S3, Hive, Spark, Pre… | 2 - 4 years | Gurgaon
…project plan, budget, structure, schedule and staffing requirements, including IBM and Client employees and 3rd party vendors. Requires experience with P… | 8 - 10 years | Bangalore
…OLTP and OLAP models. Experience working on Tableau 9.x (Desktop, Server, Reader), creating various Reports and Dashboards using different funct… | 4 - 6 years | Any
…Solutions Using SQL; Design Data Lake Solutions; Design Solutions Using Hadoop; Design Solutions Using MapReduce; Design Solutions Using HDFS; … | 8 - 10 years | Bangalore
…Cloud) - Analysis of applications specifically from standpoint of extraction, structured and unstructured data repository management - Design and implemen… | 6 - 8 years | Bangalore
…build and manage solutions using Web-based BI reporting using MicroStrategy. Other Preferred Qualifications: Agile experience is preferred | 6 - 8 years | Any
Data Modelling with Banking domain experience | 6 - 8 years | Bangalore
Data Modelling with Banking domain experience | 4 - 6 years | Bangalore
…analysis and ETL development activities. Worked with Big Data projects across Hadoop and Cloud technologies (or equivalents): S3, Hive, Spark, Pre… | 6 - 8 years | Gurgaon
…Content Navigator Configuration and Customization. In addition, it is required to have hands-on experience with administrator tools such as ACCE, knowl… | 6 - 8 years | Hyderabad
…data models according to business requirements for specific use cases and/or client's business domains; …modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling, Entity-Relationship (ER) Models… | 4 - 6 years | Hyderabad (×10)
…data models according to business requirements for specific use cases and/or client's business domains; …modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling, Entity-Relationship (ER) Models… | 6 - 8 years | Hyderabad (×8)
…data models according to business requirements for specific use cases and/or client's business domains; …modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling, Entity-Relationship (ER) Models… | 4 - 6 years | Hyderabad (×6)
…data models according to business requirements for specific use cases and/or client's business domains; …modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling, Entity-Relationship (ER) Models… | 6 - 8 years | Hyderabad (×4)
…design kit. Includes integration between Business Intelligence Platforms, which will allow users to toggle easily between reporting and analysis tasks. The re… | 6 - 8 years | Bangalore
…Spark, Impala, Hive, as well as database technology; Source Code Control (experience with Git preferred); able to perform Unix / Linux scripting | 4 - 6 years | Hyderabad (×9)
spark, impala, hive, as well as database technology Source Code Control (experience with Git preferred) Able to perform Unix / Linux scripting 6 - 8 years Hyderabad
spark, impala, hive, as well as database technology Source Code Control (experience with Git preferred) Able to perform Unix / Linux scripting 6 - 8 years Hyderabad
spark, impala, hive, as well as database technology Source Code Control (experience with Git preferred) Able to perform Unix / Linux scripting 6 - 8 years Hyderabad
spark, impala, hive, as well as database technology Source Code Control (experience with Git preferred) Able to perform Unix / Linux scripting 6 - 8 years Hyderabad
spark, impala, hive, as well as database technology Source Code Control (experience with Git preferred) Able to perform Unix / Linux scripting 6 - 8 years Hyderabad
spark, impala, hive, as well as database technology Source Code Control (experience with Git preferred) Able to perform Unix / Linux scripting 6 - 8 years Hyderabad
spark, impala, hive, as well as database technology Source Code Control (experience with Git preferred) Able to perform Unix / Linux scripting 6 - 8 years Hyderabad
spark, impala, hive, as well as database technology Source Code Control (experience with Git preferred) Able to perform Unix / Linux scripting 6 - 8 years Hyderabad
spark, impala, hive, as well as database technology Source Code Control (experience with Git preferred) Able to perform Unix / Linux scripting 6 - 8 years Hyderabad
spark, impala, hive, as well as database technology Source Code Control (experience with Git preferred) Able to perform Unix / Linux scripting 6 - 8 years Hyderabad
spark, impala, hive, as well as database technology Source Code Control (experience with Git preferred) Able to perform Unix / Linux scripting 6 - 8 years Hyderabad
spark, impala, hive, as well as database technology Source Code Control (experience with Git preferred) Able to perform Unix / Linux scripting 6 - 8 years Hyderabad
spark, impala, hive, as well as database technology Source Code Control (experience with Git preferred) Able to perform Unix / Linux scripting 6 - 8 years Hyderabad
spark, impala, hive, as well as database technology Source Code Control (experience with Git preferred) Able to perform Unix / Linux scripting 6 - 8 years Hyderabad
spark, impala, hive, as well as database technology Source Code Control (experience with Git preferred) Able to perform Unix / Linux scripting 6 - 8 years Hyderabad
spark, impala, hive, as well as database technology Source Code Control (experience with Git preferred) Able to perform Unix / Linux scripting 6 - 8 years Hyderabad
spark, impala, hive, as well as database technology Source Code Control (experience with Git preferred) Able to perform Unix / Linux scripting 4 - 6 years Hyderabad
spark, impala, hive, as well as database technology Source Code Control (experience with Git preferred) Able to perform Unix / Linux scripting 4 - 6 years Hyderabad
spark, impala, hive, as well as database technology Source Code Control (experience with Git preferred) Able to perform Unix / Linux scripting 4 - 6 years Hyderabad
spark, impala, hive, as well as database technology Source Code Control (experience with Git preferred) Able to perform Unix / Linux scripting 4 - 6 years Hyderabad
spark, impala, hive, as well as database technology Source Code Control (experience with Git preferred) Able to perform Unix / Linux scripting 4 - 6 years Hyderabad
spark, impala, hive, as well as database technology Source Code Control (experience with Git preferred) Able to perform Unix / Linux scripting 4 - 6 years Hyderabad
spark, impala, hive, as well as database technology Source Code Control (experience with Git preferred) Able to perform Unix / Linux scripting 4 - 6 years Hyderabad
spark, impala, hive, as well as database technology Source Code Control (experience with Git preferred) Able to perform Unix / Linux scripting 4 - 6 years Hyderabad
spark, impala, hive, as well as database technology Source Code Control (experience with Git preferred) Able to perform Unix / Linux scripting 4 - 6 years Hyderabad
spark, impala, hive, as well as database technology Source Code Control (experience with Git preferred) Able to perform Unix / Linux scripting 4 - 6 years Hyderabad
spark, impala, hive, as well as database technology Source Code Control (experience with Git preferred) Able to perform Unix / Linux scripting 4 - 6 years Hyderabad
spark, impala, hive, as well as database technology Source Code Control (experience with Git preferred) Able to perform Unix / Linux scripting 4 - 6 years Hyderabad
spark, impala, hive, as well as database technology Source Code Control (experience with Git preferred) Able to perform Unix / Linux scripting 4 - 6 years Hyderabad
spark, impala, hive, as well as database technology Source Code Control (experience with Git preferred) Able to perform Unix / Linux scripting 4 - 6 years Hyderabad
spark, impala, hive, as well as database technology Source Code Control (experience with Git preferred) Able to perform Unix / Linux scripting 4 - 6 years Hyderabad
spark, impala, hive, as well as database technology Source Code Control (experience with Git preferred) Able to perform Unix / Linux scripting 4 - 6 years Hyderabad
spark, impala, hive, as well as database technology Source Code Control (experience with Git preferred) Able to perform Unix / Linux scripting 4 - 6 years Hyderabad
spark, impala, hive, as well as database technology Source Code Control (experience with Git preferred) Able to perform Unix / Linux scripting 4 - 6 years Hyderabad
spark, impala, hive, as well as database technology Source Code Control (experience with Git preferred) Able to perform Unix / Linux scripting 4 - 6 years Hyderabad
spark, impala, hive, as well as database technology Source Code Control (experience with Git preferred) Able to perform Unix / Linux scripting 4 - 6 years Hyderabad
spark, impala, hive, as well as database technology Source Code Control (experience with Git preferred) Able to perform Unix / Linux scripting 4 - 6 years Hyderabad
spark, impala, hive, as well as database technology Source Code Control (experience with Git preferred) Able to perform Unix / Linux scripting 4 - 6 years Hyderabad
spark, impala, hive, as well as database technology Source Code Control (experience with Git preferred) Able to perform Unix / Linux scripting 4 - 6 years Hyderabad
spark, impala, hive, as well as database technology Source Code Control (experience with Git preferred) Able to perform Unix / Linux scripting 4 - 6 years Hyderabad
spark, impala, hive, as well as database technology Source Code Control (experience with Git preferred) Able to perform Unix / Linux scripting 4 - 6 years Hyderabad
spark, impala, hive, as well as database technology Source Code Control (experience with Git preferred) Able to perform Unix / Linux scripting 4 - 6 years Hyderabad
nalysis and ETL development activities. Worked with Big Data projects across Hadoop and Cloud technologies (or equivalents) S3, Hive, Spark, Pre 2 - 4 years Gurgaon
n OLTP and OLAP models Experience working on Tableau 9.x (Desktop, Server, Reader), creating various Reports and Dashboards using different funct 4 - 6 years Any
ude designing and developing extract, transform and load (ETL) processes. Experience includes full lifecycle implementation of the technical components 1 - 3 years Hyderabad
ude designing and developing extract, transform and load (ETL) processes. Experience includes full lifecycle implementation of the technical components 2 - 4 years Hyderabad
ude designing and developing extract, transform and load (ETL) processes. Experience includes full lifecycle implementation of the technical components 2 - 4 years Hyderabad
ude designing and developing extract, transform and load (ETL) processes. Experience includes full lifecycle implementation of the technical components 2 - 4 years Hyderabad
iving long term data architecture roadmaps in alignment with corporate strategic objectives. Experience with following message queuing, stream processi 6 - 8 years Bangalore
build and manage solutions using Web-based BI reporting using MicroStrategy Other Preferred Qualifications: Agile experience is preferred 6 - 8 years Any
build and manage solutions using Web-based BI reporting using MicroStrategy Other Preferred Qualifications: Agile experience is preferred 6 - 8 years Any
Data Modelling with Banking domain experience 6 - 8 years Bangalore
Data Modelling with Banking domain experience 4 - 6 years Bangalore
t, schedule, and contractual deliverables, which includes applying techniques for planning, tracking, change control, and risk management. 6 - 8 years Navi Mumbai
t, schedule, and contractual deliverables, which includes applying techniques for planning, tracking, change control, and risk management. 6 - 8 years Navi Mumbai
Nice to have Skill: Any BI Reporting tools, Cloud Services in Data Analytics experience. Should independently work with business IT and command ove 4 - 6 years Bangalore
Nice to have Skill: Any BI Reporting tools, Cloud Services in Data Analytics experience. Should independently work with business IT and command ove 4 - 6 years Bangalore
Nice to have Skill: Any BI Reporting tools, Cloud Services in Data Analytics experience. Should independently work with business IT and command ove 4 - 6 years Bangalore
Nice to have Skill: Any BI Reporting tools, Cloud Services in Data Analytics experience. Should independently work with business IT and command ove 4 - 6 years Bangalore
acle, Greenplum. Experience working with Data Warehousing, ETL Development and ETL Architecture Excellent communication and troubleshooting sk 4 - 6 years Bangalore
using Informatica Cloud (IICS) 2. Experience of setting informatica pipeline with Snowflake as target 3. Knowledge on CI CD using Bitbucket and jenkin 6 - 8 years Kolkata
rs of experience with change management procedures and SDLC is a must Experience with analytical and operational data Store modeling Experience wi 4 - 6 years Bangalore
sources Experience with Data Marts, Data Warehouse structures (e.g., star schema, fact and dimensions) 3+ years Database Experience including Oracle 6 - 8 years Kolkata
ound data profiling, cleansing, parsing, standardization, verification, matching, rules and data quality exception monitoring and handling. 4 - 6 years Any
Should have at least 3+ years of experience in Google Big Query, ETL, Python, Data lake 2 - 4 years Gurgaon
Should have at least 3+ years of experience in Google Big Query, ETL, Python, Data lake 2 - 4 years Gurgaon
Should have at least 3+ years of experience in Google Big Query, ETL, Python, Data lake 2 - 4 years Gurgaon
Should have at least 7+ years of experience in Google Big Query, ETL, Python, Data lake 4 - 6 years Gurgaon
Should have at least 7+ years of experience in Google Big Query, ETL, Python, Data lake 4 - 6 years Gurgaon
Should have at least 9+ years of experience in Google Big Query, ETL, Python, Data lake 8 - 10 years Gurgaon
Should have at least 3+ years of experience in Google Big Query, ETL, Python, Data lake 1 - 3 years Gurgaon
Should have at least 3+ years of experience in Google Big Query, ETL, Python, Data lake 1 - 3 years Gurgaon
Should have at least 3+ years of experience in Google Big Query, ETL, Python, Data lake 1 - 3 years Gurgaon
Application consultant (Knowledge of xECM and Archive Server), OpenText 6 - 8 years Ahmedabad
OpenText ECM Consultant / Developer JD: Knowledge of Development and Business workspace) 4 - 6 years Ahmedabad
Google tools like BigQuery, App Engine, AirFlow, Cloud Composer, etc. + Data lakes... data ponds... domain driven design... data dictionaries, content ca 4 - 6 years Bangalore
Google tools like BigQuery, App Engine, AirFlow, Cloud Composer, etc. + Data lakes... data ponds... domain driven design... data dictionaries, content ca 4 - 6 years Bangalore
Google tools like BigQuery, App Engine, AirFlow, Cloud Composer, etc. + Data lakes... data ponds... domain driven design... data dictionaries, content ca 4 - 6 years Bangalore
Google tools like BigQuery, App Engine, AirFlow, Cloud Composer, etc. + Data lakes... data ponds... domain driven design... data dictionaries, content ca 6 - 8 years Bangalore
n - SQL DB and BI Tools - GCP Big Query (preferred) Nice to have Skills: - Java 8 - Python - OLAP Cubes and Star Schema 2 - 4 years Bangalore
n - SQL DB and BI Tools - GCP Big Query (preferred) Nice to have Skills: - Java 8 - Python - OLAP Cubes and Star Schema 6 - 8 years Bangalore
concepts Google Cloud Platform expertise Data Engineer Certification is preferred Coding & CI/CD GitHub Management SQL Secondary Skills: Data W 4 - 6 years Bangalore
- To work on Data Migration Activities from current environment to Azure - Execute tasks based on directions from the Technical Team lead / Architect 2 - 4 years Bangalore
- Execute tasks based on directions from the Technical Team lead / Architect 2 - 4 years Bangalore
ork on solution design of ETL activity - should have a sound knowledge in Informatica Powercenter development along with SQL knowledge - should hav 4 - 6 years Bangalore
rk) Experience in Big Data file formats (e.g. Parquet) Experience in object-oriented programming languages (Python) Solid understanding of Data Model 4 - 6 years Bangalore
rk) Experience in Big Data file formats (e.g. Parquet) Experience in object-oriented programming languages (Python) Solid understanding of Data Model 4 - 6 years Bangalore
4 - 6 years Bangalore
MDM Sustain developer 4 - 6 years Kolkata
strong knowledge in areas SAS development experience with SAS Base, SAS DI, SAS Enterprise Guide and must have SAS ETL development skills. 1 - 3 years Hyderabad
Snowflake / Wherescape developer 6 - 8 years Bangalore
Snowflake / Wherescape developer 6 - 8 years Bangalore
st Export, Fast Load, MultiLoad, TPump and TPT), Oracle 8.1 with ETL knowledge Good to have (Not Mandatory): SQL, UNIX/Linux Shell Scripting, P 4 - 6 years Bangalore
• Should have the capability to work with minimal guidance • Knowledge of Agile way of working is preferred 4 - 6 years Pune
• Should have the capability to work with minimal guidance • Knowledge of Agile way of working is preferred 4 - 6 years Pune
1. Experience of working with Azure cloud based implementations of Snowflake 2. Experience in Data Migration from RDBMS to Snowflake cloud data warehouse 4 - 6 years Pune
GCP - Data Engg- DataProc/ Big Query / Big Table, HDP/Python/PySpark/Spark 4 - 6 years Pune
GCP - Data Engg- DataProc/ Big Query / Big Table, HDP/Python/PySpark/Spark 6 - 8 years Pune
GCP - Data Engg- DataProc/ Big Query / Big Table, HDP/Python/PySpark/Spark 2 - 4 years Pune
GCP - Data Engg- DataProc/ Big Query / Big Table, HDP/Python/PySpark/Spark 2 - 4 years Pune
GCP - Data Engg- DataProc/ Big Query / Big Table, HDP/Python/PySpark/Spark 6 - 8 years Pune
Nice to have Skill: Any BI Reporting tools, Cloud Services in Data Analytics experience. Should independently work with business IT and command ove 4 - 6 years Bangalore
Nice to have Skill: Any BI Reporting tools, Cloud Services in Data Analytics experience. Should independently work with business IT and command ove 4 - 6 years Bangalore
Nice to have Skill: Any BI Reporting tools, Cloud Services in Data Analytics experience. Should independently work with business IT and command ove 4 - 6 years Bangalore
Nice to have Skill: Any BI Reporting tools, Cloud Services in Data Analytics experience. Should independently work with business IT and command ove 4 - 6 years Bangalore
Nice to have Skill: Any BI Reporting tools, Cloud Services in Data Analytics experience. Should independently work with business IT and command ove 4 - 6 years Bangalore
on programming language Debugging/troubleshooting of Spark jobs Performance tuning experience for Hadoop/Spark jobs Good To Have Hands-on deve 4 - 6 years Bangalore
on programming language Debugging/troubleshooting of Spark jobs Performance tuning experience for Hadoop/Spark jobs Good To Have Hands-on deve 4 - 6 years Bangalore
acle, Greenplum. Experience working with Data Warehousing, ETL Development and ETL Architecture Excellent communication and troubleshooting sk 4 - 6 years Bangalore
t, waterfall chart pie charts etc. Knowledge in connecting HANA DB to Tableau Knowledge in SQL related to Tableau End to End experience in Tablea 6 - 8 years Kolkata
MDM hub development, MDM File Import process, design/build MDM Batch Jobs set up Strong ability to understand, document and communicate technic 4 - 6 years Any
ke Architecture experience in designing and developing projects. Understand ETL, primarily informatica, for data engineering and knowledge of SQL. Cr 6 - 8 years Bangalore
rs of experience with change management procedures and SDLC is a must Experience with analytical and operational data Store modeling Experience wi 4 - 6 years Bangalore
sources Experience with Data Marts, Data Warehouse structures (e.g., star schema, fact and dimensions) 3+ years Database Experience including Oracle 6 - 8 years Kolkata
ound data profiling, cleansing, parsing, standardization, verification, matching, rules and data quality exception monitoring and handling. 4 - 6 years Any
Should have at least 3+ years of experience in Google Big Query, ETL, Python, Data lake 2 - 4 years Gurgaon
Should have at least 3+ years of experience in Google Big Query, ETL, Python, Data lake 2 - 4 years Gurgaon
Should have at least 3+ years of experience in Google Big Query, ETL, Python, Data lake 2 - 4 years Gurgaon
Should have at least 7+ years of experience in Google Big Query, ETL, Python, Data lake 4 - 6 years Gurgaon
Should have at least 7+ years of experience in Google Big Query, ETL, Python, Data lake 4 - 6 years Gurgaon
Should have at least 3+ years of experience in Google Big Query, ETL, Python, Data lake 1 - 3 years Gurgaon
Should have at least 3+ years of experience in Google Big Query, ETL, Python, Data lake 1 - 3 years Gurgaon
Should have at least 3+ years of experience in Google Big Query, ETL, Python, Data lake 1 - 3 years Gurgaon
Application consultant (Knowledge of xECM and Archive Server), OpenText 6 - 8 years Ahmedabad
OpenText ECM Consultant / Developer JD: Knowledge of Development and Business workspace) 4 - 6 years Ahmedabad
Google tools like BigQuery, App Engine, AirFlow, Cloud Composer, etc. + Data lakes... data ponds... domain driven design... data dictionaries, content ca 4 - 6 years Bangalore
Google tools like BigQuery, App Engine, AirFlow, Cloud Composer, etc. + Data lakes... data ponds... domain driven design... data dictionaries, content ca 4 - 6 years Bangalore
Google tools like BigQuery, App Engine, AirFlow, Cloud Composer, etc. + Data lakes... data ponds... domain driven design... data dictionaries, content ca 6 - 8 years Bangalore
Google tools like BigQuery, App Engine, AirFlow, Cloud Composer, etc. + Data lakes... data ponds... domain driven design... data dictionaries, content ca 6 - 8 years Bangalore
n - SQL DB and BI Tools - GCP Big Query (preferred) Nice to have Skills: - Java 8 - Python - OLAP Cubes and Star Schema 2 - 4 years Bangalore
rs of experience on - ETL Developer Nice to have Skills: - Java 8 - Python - OLAP Cubes and Star Schema 4 - 6 years Bangalore
• Experience in leading • Product Ownership experience 6 - 8 years Bangalore
concepts Google Cloud Platform expertise Data Engineer Certification is preferred Coding & CI/CD GitHub Management SQL Secondary Skills: Data W 4 - 6 years Bangalore
- To work on Data Migration Activities from current environment to Azure - Execute tasks based on directions from the Technical Team lead / Architect 2 - 4 years Bangalore
- To work on Data Migration Activities from current environment to Azure - Execute tasks based on directions from the Technical Team lead / Architect 2 - 4 years Bangalore
- Execute tasks based on directions from the Technical Team lead / Architect 2 - 4 years Bangalore
Informatica power center. Should be able to handle the operational activities independently and resolve issues. Very Good SQL knowledge and DW conc 4 - 6 years Bangalore
TL & SQL) knowledge - Able to do dependency analysis for change request & present it before client in a non-ambiguous way. Candidate should also be 4 - 6 years Bangalore
rk) Experience in Big Data file formats (e.g. Parquet) Experience in object-oriented programming languages (Python) Solid understanding of Data Model 4 - 6 years Bangalore
rk) Experience in Big Data file formats (e.g. Parquet) Experience in object-oriented programming languages (Python) Solid understanding of Data Model 4 - 6 years Bangalore
4 - 6 years Bangalore
MDM Sustain developer 4 - 6 years Kolkata
MDM Sustain developer 4 - 6 years Kolkata
strong knowledge in areas SAS development experience with SAS Base, SAS DI, SAS Enterprise Guide and must have SAS ETL development skills. 1 - 3 years Hyderabad
Snowflake / Wherescape developer 6 - 8 years Bangalore
Snowflake / Wherescape developer 6 - 8 years Bangalore
Datawarehouse/BigQuery/Big data Developer/Analyst/Data Engineer 4 - 6 years Hyderabad
GCP - Data Engg- DataProc/ Big Query / Big Table, HDP/Python/PySpark/Spark 4 - 6 years Pune
GCP - Data Engg- DataProc/ Big Query / Big Table, HDP/Python/PySpark/Spark 2 - 4 years Pune
GCP - Data Engg- DataProc/ Big Query / Big Table, HDP/Python/PySpark/Spark 2 - 4 years Pune
GCP - Data Engg- DataProc/ Big Query / Big Table, HDP/Python/PySpark/Spark 2 - 4 years Pune
GCP - Data Engg- DataProc/ Big Query / Big Table, HDP/Python/PySpark/Spark 2 - 4 years Pune
GCP - Data Engg- DataProc/ Big Query / Big Table, HDP/Python/PySpark/Spark 4 - 6 years Pune
GCP - Data Engg- DataProc/ Big Query / Big Table, HDP/Python/PySpark/Spark 6 - 8 years Pune
GCP - Data Engg- DataProc/ Big Query / Big Table, HDP/Python/PySpark/Spark 2 - 4 years Pune
GCP - Data Engg- DataProc/ Big Query / Big Table, HDP/Python/PySpark/Spark 6 - 8 years Pune
GCP - Data Engg- DataProc/ Big Query / Big Table, HDP/Python/PySpark/Spark 2 - 4 years Pune
GCP - Data Engg- DataProc/ Big Query / Big Table, HDP/Python/PySpark/Spark 6 - 8 years Pune
perience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc. - Experience/knowledge on AWS cloud services - Experience 2 - 4 years Gurgaon
bjects in the application. Migration and deployment of objects among Development, Test, Production Environments. Reviewing objects prepared by 4 - 6 years Hyderabad
Big Data technologies including API development. Expected to have traditional Application Development background along with knowledge of Analytics 6 - 8 years Hyderabad
perience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc. - Experience/knowledge on AWS cloud services - Experience 2 - 4 years Gurgaon
QL Server Strong on DW Fundamentals, concepts. Strong knowledge of database design and entity relationships. Data warehouse and 6 - 8 years Hyderabad
experience -3 years of SQL and unix experience -2 Years of Datawarehousing Experience -Nice to have experience in Java 4 - 6 years Bangalore
omponents -Expert with ETL tools using REST and SOAP API. -understanding of Salesforce Data Models and Informatica components -SQL knowledge - 6 - 8 years Bangalore
ustry d. Should be well conversant in English and should have excellent writing, MIS, communication, time management and multi-tasking skills Shoul 2 - 4 years Any
ETL Design and development techniques, Create ETL jobs from source-target mapping documents, Extensive ETL and SQL database skills 4 - 6 years Bangalore
th building dashboard, this will be an individual contributor's role who will be driving changes by themselves and helping us build dashboards. Strong in b 2 - 4 years Pune
th building dashboard, this will be an individual contributor's role who will be driving changes by themselves and helping us build dashboards. Strong in b 2 - 4 years Pune
th building dashboard, this will be an individual contributor's role who will be driving changes by themselves and helping us build dashboards. Strong in b 2 - 4 years Pune
th building dashboard, this will be an individual contributor's role who will be driving changes by themselves and helping us build dashboards. Strong in b 2 - 4 years Pune
on-prod sites. Refreshing Extracts on non-prod sites. Monitoring the status of extract schedules. Performing Unit Testing on non-prod. Resolving any data 6 - 8 years Bangalore
Primary Skills: IBM DataStage, PL/SQL, Unix 2 - 4 years Any
Primary Skills: IBM DataStage, PL/SQL, Unix 2 - 4 years Any
nd analytical layer Data integration architecture design estimate data engineer efforts (pipelines for ingestion, transformation and storage destination) desi 6 - 8 years Kolkata
namoDB / AWS Kinesis / Apache Kafka / Apache EMR / AWS Glue / Apache Airflow / Lambda + Step Functions / OpenShift / AWS EKS 6 - 8 years Bangalore
Azure - Data Bricks - Spark, Synapse Analytics, ADF - Python 6 - 8 years Bangalore
Azure - Data Bricks - Spark, Synapse Analytics, ADF - Python 6 - 8 years Bangalore
Azure - Data Bricks - Spark, Synapse Analytics, ADF - Python 6 - 8 years Bangalore
OpenSource - Big Data - Spark / PySpark or Scala or Java + Talend / Pentaho 6 - 8 years Bangalore
OT products - Document Presentation for SAP Solutions including Live S/4 addon - Imaging Enterprise Scan Exposure to other OT products, Some of the 6 - 8 years Hyderabad
data model and traceback issues related to data to source systems - understand interfaces across different systems and ACM. Person should have worked 6 - 8 years Bangalore
t, waterfall chart pie charts etc. Knowledge in connecting HANA DB to Tableau Knowledge in SQL related to Tableau End to End experience in Tablea 8 - 10 years Kolkata
Special Challenges: oncile different – often competing – interests and requirements applicable to solutions, and the need to escalate conflicts timely as appropriate. 6 - 8 years Bangalore
Special Challenges: oncile different – often competing – interests and requirements applicable to solutions, and the need to escalate conflicts timely as appropriate. 6 - 8 years Bangalore
Special Challenges: oncile different – often competing – interests and requirements applicable to solutions, and the need to escalate conflicts timely as appropriate. 6 - 8 years Bangalore
Special Challenges: oncile different – often competing – interests and requirements applicable to solutions, and the need to escalate conflicts timely as appropriate. 6 - 8 years Bangalore
Special Challenges: oncile different – often competing – interests and requirements applicable to solutions, and the need to escalate conflicts timely as appropriate. 6 - 8 years Bangalore
6 - 8 years Bangalore
6 - 8 years Bangalore
6 - 8 years Bangalore
6 - 8 years Bangalore
6 - 8 years Bangalore
6 - 8 years Bangalore
6 - 8 years Bangalore
6 - 8 years Bangalore
6 - 8 years Bangalore
echnical requirements into detailed design. Perform analysis of vast data stores and uncover insights. Maintain security and data privacy. Create sc 6 - 8 years Bangalore
echnical requirements into detailed design. Perform analysis of vast data stores and uncover insights. Maintain security and data privacy. Create sc 6 - 8 years Bangalore
echnical requirements into detailed design. Perform analysis of vast data stores and uncover insights. Maintain security and data privacy. Create sc 6 - 8 years Bangalore
querying tools, such as Hive and Impala Basic knowledge on LINUX commands. Good communication skills. Follow organization's support process in r 6 - 8 years Bangalore
and developing dashboards, reports, visualizations and storytelling techniques, and customizing out of the box widgets. Hands on expertise on Qlik scr 2 - 4 years Bangalore
on-prod sites. Refreshing Extracts on non-prod sites. Monitoring the status of extract schedules. Performing Unit Testing on non-prod. Resolving any data 2 - 4 years Bangalore
on-prod sites. Refreshing Extracts on non-prod sites. Monitoring the status of extract schedules. Performing Unit Testing on non-prod. Resolving any data 2 - 4 years Bangalore
on-prod sites. Refreshing Extracts on non-prod sites. Monitoring the status of extract schedules. Performing Unit Testing on non-prod. Resolving any data 4 - 6 years Bangalore
on-prod sites. Refreshing Extracts on non-prod sites. Monitoring the status of extract schedules. Performing Unit Testing on non-prod. Resolving any data 4 - 6 years Bangalore
on-prod sites. Refreshing Extracts on non-prod sites. Monitoring the status of extract schedules. Performing Unit Testing on non-prod. Resolving any data 6 - 8 years Bangalore
Primary Skills: IBM DataStage, PL/SQL, Unix 2 - 4 years Any
Primary Skills: IBM DataStage, PL/SQL, Unix 2 - 4 years Any
namoDB / AWS Kinesis / Apache Kafka / Apache EMR / AWS Glue / Apache Airflow / Lambda + Step Functions / OpenShift / AWS EKS 4 - 6 years Bangalore
namoDB / AWS Kinesis / Apache Kafka / Apache EMR / AWS Glue / Apache Airflow / Lambda + Step Functions / OpenShift / AWS EKS 4 - 6 years Bangalore
namoDB / AWS Kinesis / Apache Kafka / Apache EMR / AWS Glue / Apache Airflow / Lambda + Step Functions / OpenShift / AWS EKS 6 - 8 years Bangalore
namoDB / AWS Kinesis / Apache Kafka / Apache EMR / AWS Glue / Apache Airflow / Lambda + Step Functions / OpenShift / AWS EKS 6 - 8 years Bangalore
Azure - Data Bricks - Spark, Synapse Analytics, ADF - Python 6 - 8 years Bangalore
Azure - Data Bricks - Spark, Synapse Analytics, ADF - Python 6 - 8 years Bangalore
Azure - Data Bricks - Spark, Synapse Analytics, ADF - Python 6 - 8 years Bangalore
pment of ETL mappings, workflows Strong SQL skills in database languages like Oracle / Netezza / DB2 Preferred development skills in Unix scripting 4 - 6 years Mumbai
Special Challenges: oncile different – often competing – interests and requirements applicable to solutions, and the need to escalate conflicts timely as appropriate. 6 - 8 years Bangalore
Special Challenges: oncile different – often competing – interests and requirements applicable to solutions, and the need to escalate conflicts timely as appropriate. 6 - 8 years Bangalore
Special Challenges: oncile different – often competing – interests and requirements applicable to solutions, and the need to escalate conflicts timely as appropriate. 6 - 8 years Bangalore
Special Challenges: oncile different – often competing – interests and requirements applicable to solutions, and the need to escalate conflicts timely as appropriate. 6 - 8 years Bangalore
Special Challenges: oncile different – often competing – interests and requirements applicable to solutions, and the need to escalate conflicts timely as appropriate. 6 - 8 years Bangalore
6 - 8 years Bangalore
6 - 8 years Bangalore
6 - 8 years Bangalore
6 - 8 years Bangalore
6 - 8 years Bangalore
6 - 8 years Bangalore
6 - 8 years Bangalore
6 - 8 years Bangalore
6 - 8 years Bangalore
6 - 8 years Bangalore
echnical requirements into detailed design. Perform analysis of vast data stores and uncover insights. Maintain security and data privacy. Create sc 6 - 8 years Bangalore
echnical requirements into detailed design. Perform analysis of vast data stores and uncover insights. Maintain security and data privacy. Create sc 6 - 8 years Bangalore
echnical requirements into detailed design. Perform analysis of vast data stores and uncover insights. Maintain security and data privacy. Create sc 6 - 8 years Bangalore
querying tools, such as Hive and Impala. Basic knowledge on LINUX commands. Good communication skills. Follow organization's support process in r • Mandatory skills required – Spark, Scala, Oozie, HIVE, Shell script, Jenkins, Ansible, Github 6 - 8 years Bangalore
Good to have skills – Nifi, Elastic, KIBANA, GRAFANA, Kafka - Good oral and written communication abilities 2 - 4 years Bangalore
- Proficient in MS Office tools and SDLC life cycle. 2 - 4 years Bangalore
ment Console, Talend JobServer, Talend Runtime, Talend Remote Engine, and all other server components. Open Shift, Kubernetes and Managing Depl 6 - 8 years Bangalore
ment Console, Talend JobServer, Talend Runtime, Talend Remote Engine, and all other server components. Open Shift, Kubernetes and Managing Depl 6 - 8 years Bangalore
ment Console, Talend JobServer, Talend Runtime, Talend Remote Engine, and all other server components. Open Shift, Kubernetes and Managing Depl 6 - 8 years Bangalore
ment Console, Talend JobServer, Talend Runtime, Talend Remote Engine, and all other server components. Open Shift, Kubernetes and Managing Depl 6 - 8 years Bangalore
ment Console, Talend JobServer, Talend Runtime, Talend Remote Engine, and all other server components. Open Shift, Kubernetes and Managing Depl 6 - 8 years Bangalore
ment Console, Talend JobServer, Talend Runtime, Talend Remote Engine, and all other server components. Open Shift, Kubernetes and Managing Depl 6 - 8 years Bangalore
ment Console, Talend JobServer, Talend Runtime, Talend Remote Engine, and all other server components. Open Shift, Kubernetes and Managing Depl 6 - 8 years Bangalore
ment Console, Talend JobServer, Talend Runtime, Talend Remote Engine, and all other server components. Open Shift, Kubernetes and Managing Depl 6 - 8 years Bangalore
ment Console, Talend JobServer, Talend Runtime, Talend Remote Engine, and all other server components. Open Shift, Kubernetes and Managing Depl • Mandatory skills required – Spark, Scala, Oozie, HIVE, Shell script, Jenkins, Ansible, Github 6 - 8 years Bangalore
• Mandatory skills required – Spark, Scala, Oozie, HIVE, Shell script, Jenkins, Ansible, Github Good to have skills – Nifi, Elastic, KIBANA, GRAFANA, Kafka 2 - 4 years Bangalore
Good to have skills – Nifi, Elastic, KIBANA, GRAFANA, Kafka - Good oral and written communication abilities 4 - 6 years Bangalore
- Proficient in MS Office tools and SDLC life cycle. 2 - 4 years Bangalore
Hands on experience on Java/Web frameworks is mandatory
Experience on ML techniques is added advantage. 4 - 6 years Hyderabad
ment Console, Talend JobServer, Talend Runtime, Talend Remote Engine, and all other server components. Open Shift, Kubernetes and Managing Depl 6 - 8 years Bangalore
ment Console, Talend JobServer, Talend Runtime, Talend Remote Engine, and all other server components. Open Shift, Kubernetes and Managing Depl 6 - 8 years Bangalore
ment Console, Talend JobServer, Talend Runtime, Talend Remote Engine, and all other server components. Open Shift, Kubernetes and Managing Depl 6 - 8 years Bangalore
ment Console, Talend JobServer, Talend Runtime, Talend Remote Engine, and all other server components. Open Shift, Kubernetes and Managing Depl 6 - 8 years Bangalore
ment Console, Talend JobServer, Talend Runtime, Talend Remote Engine, and all other server components. Open Shift, Kubernetes and Managing Depl 6 - 8 years Bangalore
ment Console, Talend JobServer, Talend Runtime, Talend Remote Engine, and all other server components. Open Shift, Kubernetes and Managing Depl 6 - 8 years Bangalore
ment Console, Talend JobServer, Talend Runtime, Talend Remote Engine, and all other server components. Open Shift, Kubernetes and Managing Depl 6 - 8 years Bangalore
ment Console, Talend JobServer, Talend Runtime, Talend Remote Engine, and all other server components. Open Shift, Kubernetes and Managing Depl 6 - 8 years Bangalore
ment Console, Talend JobServer, Talend Runtime, Talend Remote Engine, and all other server components. Open Shift, Kubernetes and Managing Depl 6 - 8 years Bangalore
ment Console, Talend JobServer, Talend Runtime, Talend Remote Engine, and all other server components. Open Shift, Kubernetes and Managing Depl 6 - 8 years Bangalore
ment Console, Talend JobServer, Talend Runtime, Talend Remote Engine, and all other server components. Open Shift, Kubernetes and Managing Depl 6 - 8 years Bangalore
ment Console, Talend JobServer, Talend Runtime, Talend Remote Engine, and all other server components. Open Shift, Kubernetes and Managing Depl 6 - 8 years Bangalore
ment Console, Talend JobServer, Talend Runtime, Talend Remote Engine, and all other server components. Open Shift, Kubernetes and Managing Depl 6 - 8 years Bangalore
essing workflows using Hadoop and frameworks such as Spark and Cloudera. Hands-on experience on broadcasting tools like Kafka, Event Hub, AWS K 4 - 6 years Any
essing workflows using Hadoop and frameworks such as Spark and Cloudera. Hands-on experience on broadcasting tools like Kafka, Event Hub, AWS K 4 - 6 years Any
essing workflows using Hadoop and frameworks such as Spark and Cloudera. Hands-on experience on broadcasting tools like Kafka, Event Hub, AWS K 4 - 6 years Any
essing workflows using Hadoop and frameworks such as Spark and Cloudera. Hands-on experience on broadcasting tools like Kafka, Event Hub, AWS K 4 - 6 years Any
essing workflows using Hadoop and frameworks such as Spark and Cloudera. Hands-on experience on broadcasting tools like Kafka, Event Hub, AWS K 4 - 6 years Any
essing workflows using Hadoop and frameworks such as Spark and Cloudera. Hands-on experience on broadcasting tools like Kafka, Event Hub, AWS K 4 - 6 years Any
essing workflows using Hadoop and frameworks such as Spark and Cloudera. Hands-on experience on broadcasting tools like Kafka, Event Hub, AWS K 4 - 6 years Any
essing workflows using Hadoop and frameworks such as Spark and Cloudera. Hands-on experience on broadcasting tools like Kafka, Event Hub, AWS K 4 - 6 years Any
essing workflows using Hadoop and frameworks such as Spark and Cloudera. Hands-on experience on broadcasting tools like Kafka, Event Hub, AWS K 4 - 6 years Any
essing workflows using Hadoop and frameworks such as Spark and Cloudera. Hands-on experience on broadcasting tools like Kafka, Event Hub, AWS K 4 - 6 years Any
t understanding of various RDBMS - Should have Good communication skills. Performing issue analysis & providing solution on Informatica DIH applic 4 - 6 years Hyderabad
t understanding of various RDBMS - Should have Good communication skills. Performing issue analysis & providing solution on Informatica DIH applic 4 - 6 years Hyderabad
ntation. Data quality system and process design. Analysis of requirements and production of specifications. Documentation of solutio 6 - 8 years Hyderabad
sets Technical Experience Understanding of Windows operating system Experience in installing, configuring and supporting Alteryx components in the 6 - 8 years Hyderabad
sets Technical Experience Understanding of Windows operating system Experience in installing, configuring and supporting Alteryx components in the 4 - 6 years Hyderabad
nts such as Rollup, Scan, join, Partition by key, Partition by Round Robin, Gather, Merge, Interleave, Lookup, etc. Experience in designing and deliveri 4 - 6 years Gurgaon
nts such as Rollup, Scan, join, Partition by key, Partition by Round Robin, Gather, Merge, Interleave, Lookup, etc. Experience in designing and deliveri 4 - 6 years Gurgaon
nts such as Rollup, Scan, join, Partition by key, Partition by Round Robin, Gather, Merge, Interleave, Lookup, etc. Experience in designing and deliveri 4 - 6 years Gurgaon
nts such as Rollup, Scan, join, Partition by key, Partition by Round Robin, Gather, Merge, Interleave, Lookup, etc. Experience in designing and deliveri 2 - 4 years Gurgaon
nts such as Rollup, Scan, join, Partition by key, Partition by Round Robin, Gather, Merge, Interleave, Lookup, etc. Experience in designing and deliveri 2 - 4 years Gurgaon
essing workflows using Hadoop and frameworks such as Spark and Cloudera. Hands-on experience on broadcasting tools like Kafka, Event Hub, AWS K 6 - 8 years Any
essing workflows using Hadoop and frameworks such as Spark and Cloudera. Hands-on experience on broadcasting tools like Kafka, Event Hub, AWS K 6 - 8 years Any
essing workflows using Hadoop and frameworks such as Spark and Cloudera. Hands-on experience on broadcasting tools like Kafka, Event Hub, AWS K 4 - 6 years Any
essing workflows using Hadoop and frameworks such as Spark and Cloudera. Hands-on experience on broadcasting tools like Kafka, Event Hub, AWS K 4 - 6 years Any
essing workflows using Hadoop and frameworks such as Spark and Cloudera. Hands-on experience on broadcasting tools like Kafka, Event Hub, AWS K 4 - 6 years Any
essing workflows using Hadoop and frameworks such as Spark and Cloudera. Hands-on experience on broadcasting tools like Kafka, Event Hub, AWS K 4 - 6 years Any
essing workflows using Hadoop and frameworks such as Spark and Cloudera. Hands-on experience on broadcasting tools like Kafka, Event Hub, AWS K 4 - 6 years Any
essing workflows using Hadoop and frameworks such as Spark and Cloudera. Hands-on experience on broadcasting tools like Kafka, Event Hub, AWS K 4 - 6 years Any
essing workflows using Hadoop and frameworks such as Spark and Cloudera. Hands-on experience on broadcasting tools like Kafka, Event Hub, AWS K 4 - 6 years Any
essing workflows using Hadoop and frameworks such as Spark and Cloudera. Hands-on experience on broadcasting tools like Kafka, Event Hub, AWS K 4 - 6 years Any
essing workflows using Hadoop and frameworks such as Spark and Cloudera. Hands-on experience on broadcasting tools like Kafka, Event Hub, AWS K 4 - 6 years Any
essing workflows using Hadoop and frameworks such as Spark and Cloudera. Hands-on experience on broadcasting tools like Kafka, Event Hub, AWS K 4 - 6 years Any
t understanding of various RDBMS - Should have Good communication skills. Performing issue analysis & providing solution on Informatica DIH applic 4 - 6 years Hyderabad
t understanding of various RDBMS - Should have Good communication skills. Performing issue analysis & providing solution on Informatica DIH applic 4 - 6 years Hyderabad
ntation. Data quality system and process design. Analysis of requirements and production of specifications. Documentation of solutio 6 - 8 years Hyderabad
sets Technical Experience Understanding of Windows operating system Experience in installing, configuring and supporting Alteryx components in the 6 - 8 years Hyderabad
sets Technical Experience Understanding of Windows operating system Experience in installing, configuring and supporting Alteryx components in the 4 - 6 years Hyderabad
nts such as Rollup, Scan, join, Partition by key, Partition by Round Robin, Gather, Merge, Interleave, Lookup, etc. Experience in designing and deliveri 4 - 6 years Gurgaon
nts such as Rollup, Scan, join, Partition by key, Partition by Round Robin, Gather, Merge, Interleave, Lookup, etc. Experience in designing and deliveri 4 - 6 years Gurgaon
nts such as Rollup, Scan, join, Partition by key, Partition by Round Robin, Gather, Merge, Interleave, Lookup, etc. Experience in designing and deliveri 4 - 6 years Gurgaon
nts such as Rollup, Scan, join, Partition by key, Partition by Round Robin, Gather, Merge, Interleave, Lookup, etc. Experience in designing and deliveri 2 - 4 years Gurgaon
nts such as Rollup, Scan, join, Partition by key, Partition by Round Robin, Gather, Merge, Interleave, Lookup, etc. Experience in designing and deliveri 2 - 4 years Gurgaon
Should have strong hands on exp in ETL Datastage Development 4 - 6 years Bangalore
erned, and accurate, and in standardised units, makes sure data remains clean throughout lifecycle. Minimum 5 years experience, 2+ years in role. 8 - 10 years Bangalore
4 - 6 years Pune
2 - 4 years Pune
4 - 6 years Gurgaon
ping analytical solutions and building analytical models in at least 3 projects, where at least 2 of such projects involve Big Data and Unstructured Data An 2 - 4 years Gurgaon
source shall have worked on at least 2 projects involving Text Analytics using the solution proposed by BISP 2 - 4 years Delhi
MCA degree c. Experience: The resource shall possess a minimum of 5 years of experience in IT, with at least 2 years experience as HDFS Administrato 4 - 6 years Delhi
rtaining to reporting and visualization using Cognos. The resources shall have worked in at least two projects where they were involved in developing rep 2 - 4 years Delhi
t certifications. se for at least 3 projects and should have at least one project of experience Data Modeling and performance tuning for a Data Warehouse more than c. Experience: The resource should have a minimum of 5 years of experience of working with advanced BI Solution and with a minimum of 23 4 - 6 years Delhi
onversant in English and should have excellent writing, MIS, communication, time management and multi-tasking skills 6 - 8 years Delhi
Recruiter Name
Sheela Narendrarnath/India/Contr/IBM
Arun K Venkatesan2/India/IBM
Narotham Maudghal12/India/IBM
Reema Pk11/India/IBM
Narotham Maudghal12/India/IBM
Naqheeba Shaik/India/Contr/IBM
Reema Pk11/India/IBM
Narotham Maudghal12/India/IBM
Naqheeba Shaik/India/Contr/IBM
Amruta Pujari/India/Contr/IBM
Narotham Maudghal12/India/IBM
Gayatri Prasad/India/Contr/IBM
Narotham Maudghal12/India/IBM
Kiran Shelka/India/Contr/IBM
Chirayu Bapat/India/IBM
Chirayu Bapat/India/IBM
Agasthiyan R/India/Contr/IBM
Naveen Palaparthi/India/Contr/IBM
Shashwathi Nachappa/India/IBM
Sameer Ahammed/India/IBM
Agasthiyan R/India/Contr/IBM
Sameer Ahammed/India/IBM
Chirayu Bapat/India/IBM
Agasthiyan R/India/Contr/IBM
Agasthiyan R/India/Contr/IBM
Agasthiyan R/India/Contr/IBM
Priya Murugesan/India/Contr/IBM
Agasthiyan R/India/Contr/IBM
S Naqueba Sultana/India/IBM
Priyanka Sapru Bhat/India/IBM
Mohammad K Babu/India/Contr/IBM
Agasthiyan R/India/Contr/IBM
Priyanka Sapru Bhat/India/IBM
Mohammad K Babu/India/Contr/IBM
Agasthiyan R/India/Contr/IBM
S Naqueba Sultana/India/IBM
S Naqueba Sultana/India/IBM
Agasthiyan R/India/Contr/IBM
Mohammad K Babu/India/Contr/IBM
Chirayu Bapat/India/IBM
Chirayu Bapat/India/IBM
Bharath D3/India/IBM
Naveen Palaparthi/India/Contr/IBM
Naveen Palaparthi/India/Contr/IBM
Abubakar Nadaf2/India/IBM
Aldrin Dabreo/India/Contr/IBM
Priyanka Sapru Bhat/India/IBM
Naveen Palaparthi/India/Contr/IBM
Shruthi Abhinav/India/Contr/IBM
Naveen Palaparthi/India/Contr/IBM
Naveen Palaparthi/India/Contr/IBM
Abubakar Nadaf2/India/IBM
Shruthi Abhinav/India/Contr/IBM
Amruta Sunil Pujari/India/IBM
Amruta Sunil Pujari/India/IBM
Amruta Sunil Pujari/India/IBM
Amruta Sunil Pujari/India/IBM
Amruta Sunil Pujari/India/IBM
Amruta Sunil Pujari/India/IBM
Amruta Sunil Pujari/India/IBM
Aldrin Dabreo/India/Contr/IBM
Sai Rohini Edla/India/IBM
Abubakar Nadaf2/India/IBM
G Raghavendar Goud/India/IBM
Priyanka Sapru Bhat/India/IBM
Shashwathi Nachappa/India/IBM
Abubakar Nadaf2/India/IBM
Abubakar Nadaf2/India/IBM
Abubakar Nadaf2/India/IBM
Abubakar Nadaf2/India/IBM
Abubakar Nadaf2/India/IBM
Abubakar Nadaf2/India/IBM
Abubakar Nadaf2/India/IBM
Abubakar Nadaf2/India/IBM
Nurul H/India/Contr/IBM
Nurul H/India/Contr/IBM
Shashwathi Nachappa/India/IBM
Nurul H/India/Contr/IBM
Nurul H/India/Contr/IBM
Bharath D3/India/IBM
Bharath D3/India/IBM
Bharath D3/India/IBM
Bharath D3/India/IBM
Chirayu Bapat/India/IBM
Sowmya Margaret1/India/IBM
Soham Chakraborty1/India/IBM
Mohammad K Babu/India/Contr/IBM
Mohammad K Babu/India/Contr/IBM
Mohammad K Babu/India/Contr/IBM
Mohammad K Babu/India/Contr/IBM
Mohammad K Babu/India/Contr/IBM
Mohammad K Babu/India/Contr/IBM
Mohammad K Babu/India/Contr/IBM
Mohammad K Babu/India/Contr/IBM
Mohammad K Babu/India/Contr/IBM
Mohammad K Babu/India/Contr/IBM
Mohammad K Babu/India/Contr/IBM
Mohammad K Babu/India/Contr/IBM
Priya Murugesan/India/Contr/IBM
Priya Murugesan/India/Contr/IBM
Priya Murugesan/India/Contr/IBM
Naveen Palaparthi/India/Contr/IBM
Shruthi Abhinav/India/Contr/IBM
Maya C V/India/IBM
Nurul H/India/Contr/IBM
Abubakar Nadaf2/India/IBM
Nurul H/India/Contr/IBM
Shruthi Abhinav/India/Contr/IBM
Leon J Basco1/India/IBM
Leon J Basco1/India/IBM
Bharath D3/India/IBM
Sai Rohini Edla/India/IBM
Dhanapati Naorem/India/IBM
Dhanapati Naorem/India/IBM
Shashwathi Nachappa/India/IBM
Sakshi Priya1/India/IBM
Dhanapati Naorem/India/IBM
Dhanapati Naorem/India/IBM
Sai Rohini Edla/India/IBM
Sai Rohini Edla/India/IBM
Dhanapati Naorem/India/IBM
Dhanapati Naorem/India/IBM
Amruta Sunil Pujari/India/IBM
Dhanapati Naorem/India/IBM
Sai Rohini Edla/India/IBM
Priya Murugesan/India/Contr/IBM
Dhanapati Naorem/India/IBM
S Naqueba Sultana/India/IBM
Dhanapati Naorem/India/IBM
S Naqueba Sultana/India/IBM
Dhanapati Naorem/India/IBM
S Naqueba Sultana/India/IBM
Niharika Bora/India/IBM
Dhanapati Naorem/India/IBM
Dhanapati Naorem/India/IBM
Trupti Chundawat/India/IBM
Sowmya Margaret1/India/IBM
Sowmya Margaret1/India/IBM
Sowmya Margaret1/India/IBM
Sameer Ahammed/India/IBM
Sameer Ahammed/India/IBM
Sameer Ahammed/India/IBM
Shruthi Abhinav/India/Contr/IBM
Shruthi Abhinav/India/Contr/IBM
Poonam Wanave/India/IBM
Shruthi Abhinav/India/Contr/IBM
Shruthi Abhinav/India/Contr/IBM
Shruthi Abhinav/India/Contr/IBM
Poonam Wanave/India/IBM
Poonam Wanave/India/IBM
Priya Murugesan/India/Contr/IBM
Maya C V/India/IBM
Maya C V/India/IBM
Nurul H/India/Contr/IBM
Nurul H/India/Contr/IBM
Nurul H/India/Contr/IBM
Nurul H/India/Contr/IBM
Nurul H/India/Contr/IBM
Maya C V/India/IBM
Bharath D3/India/IBM
G Raghavendar Goud/India/IBM
Bharath D3/India/IBM
Shivani Shrivastava1/India/Contr/IBM
Shivani Shrivastava1/India/Contr/IBM
Shivani Shrivastava1/India/Contr/IBM
Santosh kumar Chitta2/India/IBM
Santosh kumar Chitta2/India/IBM
Santosh kumar Chitta2/India/IBM
Bharath D3/India/IBM
Bharath D3/India/IBM
Bharath D3/India/IBM
Dhanapati Naorem/India/IBM
Amruta Sunil Pujari/India/IBM
V Sharat Nag/India/IBM
V Sharat Nag/India/IBM
Jaskaran Singh42/India/IBM
Srinka Kundu/India/Contr/IBM
Amruta Sunil Pujari/India/IBM
Amruta Sunil Pujari/India/IBM
Amruta Sunil Pujari/India/IBM
Jaskaran Singh42/India/IBM
Jaskaran Singh42/India/IBM
Bharath D3/India/IBM
Bharath D3/India/IBM
Bharath D3/India/IBM
Leon J Basco1/India/IBM
Leon J Basco1/India/IBM
Leon J Basco1/India/IBM
Leon J Basco1/India/IBM
Leon J Basco1/India/IBM
Leon J Basco1/India/IBM
Leon J Basco1/India/IBM
Leon J Basco1/India/IBM
Leon J Basco1/India/IBM
Leon J Basco1/India/IBM
Leon J Basco1/India/IBM
Leon J Basco1/India/IBM
Kiran Shelka/India/Contr/IBM
Leon J Basco1/India/IBM
Leon J Basco1/India/IBM
Naveen Palaparthi/India/Contr/IBM
Naveen Palaparthi/India/Contr/IBM
Naveen Palaparthi/India/Contr/IBM
Naveen Palaparthi/India/Contr/IBM
Naveen Palaparthi/India/Contr/IBM
Naveen Palaparthi/India/Contr/IBM
Naveen Palaparthi/India/Contr/IBM
Nurul H/India/Contr/IBM
Nurul H/India/Contr/IBM
Nurul H/India/Contr/IBM
Priya Murugesan/India/Contr/IBM
Sakshi Priya1/India/IBM
Sakshi Priya1/India/IBM
Sakshi Priya1/India/IBM
Sakshi Priya1/India/IBM
Sakshi Priya1/India/IBM
Sakshi Priya1/India/IBM
Sakshi Priya1/India/IBM
Sakshi Priya1/India/IBM
Sakshi Priya1/India/IBM
Sakshi Priya1/India/IBM
Kiran Shelka/India/Contr/IBM
Prashant Kamble/India/Contr/IBM
Chirayu Bapat/India/IBM
Chirayu Bapat/India/IBM
Chirayu Bapat/India/IBM
Chirayu Bapat/India/IBM
Chirayu Bapat/India/IBM
Chirayu Bapat/India/IBM
Chirayu Bapat/India/IBM
Chirayu Bapat/India/IBM
Chirayu Bapat/India/IBM
Chirayu Bapat/India/IBM
Sakshi Priya1/India/IBM
Sakshi Priya1/India/IBM
Sakshi Priya1/India/IBM
Sakshi Priya1/India/IBM
Sakshi Priya1/India/IBM
Sakshi Priya1/India/IBM
Sakshi Priya1/India/IBM
Sakshi Priya1/India/IBM
Sakshi Priya1/India/IBM
Sameeshra Salunke/India/IBM
Sameeshra Salunke/India/IBM
Sameeshra Salunke/India/IBM
Sameeshra Salunke/India/IBM
Sameeshra Salunke/India/IBM
Sameeshra Salunke/India/IBM
Sameeshra Salunke/India/IBM
Sameeshra Salunke/India/IBM
Sameeshra Salunke/India/IBM
Sameeshra Salunke/India/IBM
Sameeshra Salunke/India/IBM
Sameeshra Salunke/India/IBM
Sameeshra Salunke/India/IBM
Sameeshra Salunke/India/IBM
Sameeshra Salunke/India/IBM
Sameeshra Salunke/India/IBM
Sameeshra Salunke/India/IBM
Sameeshra Salunke/India/IBM
Sameeshra Salunke/India/IBM
Sameeshra Salunke/India/IBM
Sameeshra Salunke/India/IBM
Sameeshra Salunke/India/IBM
Sameeshra Salunke/India/IBM
Shashwathi Nachappa/India/IBM
Chirayu Bapat/India/IBM
Abubakar Nadaf2/India/IBM
Abubakar Nadaf2/India/IBM
Abubakar Nadaf2/India/IBM
Abubakar Nadaf2/India/IBM
Abubakar Nadaf2/India/IBM
Abubakar Nadaf2/India/IBM
Abubakar Nadaf2/India/IBM
Abubakar Nadaf2/India/IBM
Abubakar Nadaf2/India/IBM
Amruta Sunil Pujari/India/IBM
Chirayu Bapat/India/IBM
Amruta Sunil Pujari/India/IBM
Amruta Sunil Pujari/India/IBM
Amruta Sunil Pujari/India/IBM
Amruta Sunil Pujari/India/IBM
Amruta Sunil Pujari/India/IBM
Amruta Sunil Pujari/India/IBM
Amruta Sunil Pujari/India/IBM
Amruta Sunil Pujari/India/IBM
Naveen Palaparthi/India/Contr/IBM
Naveen Palaparthi/India/Contr/IBM
Naveen Palaparthi/India/Contr/IBM
Naveen Palaparthi/India/Contr/IBM
Naveen Palaparthi/India/Contr/IBM
Naveen Palaparthi/India/Contr/IBM
Naveen Palaparthi/India/Contr/IBM
Naveen Palaparthi/India/Contr/IBM
Naveen Palaparthi/India/Contr/IBM
Nurul H/India/Contr/IBM
Nurul H/India/Contr/IBM
Nurul H/India/Contr/IBM
Nurul H/India/Contr/IBM
Nurul H/India/Contr/IBM
Nurul H/India/Contr/IBM
Nurul H/India/Contr/IBM
S Naqueba Sultana/India/IBM
S Naqueba Sultana/India/IBM
S Naqueba Sultana/India/IBM
S Naqueba Sultana/India/IBM
S Naqueba Sultana/India/IBM
S Naqueba Sultana/India/IBM
Mohammad K Babu/India/Contr/IBM
Chirayu Bapat/India/IBM
Nurul H/India/Contr/IBM
Sameer Ahammed/India/IBM
Naveen Palaparthi/India/Contr/IBM
Bharath D3/India/IBM
S Naqueba Sultana/India/IBM
Sakshi Priya1/India/IBM
Amruta Sunil Pujari/India/IBM
Leon J Basco1/India/IBM
Shruthi Abhinav/India/Contr/IBM
Bharath D3/India/IBM
Bharath D3/India/IBM
Chirayu Bapat/India/IBM
Chirayu Bapat/India/IBM
Sakshi Priya1/India/IBM
Amruta Sunil Pujari/India/IBM
Leon J Basco1/India/IBM
Chirayu Bapat/India/IBM
Shruthi Abhinav/India/Contr/IBM
Sameer Ahammed/India/IBM
Naveen Palaparthi/India/Contr/IBM
Rajeshwari P/India/IBM
Abubakar Nadaf2/India/IBM
Priya Murugesan/India/Contr/IBM
Sakshi Priya1/India/IBM
Mohammad K Babu/India/Contr/IBM
Nurul H/India/Contr/IBM
Naveen Palaparthi/India/Contr/IBM
Rajeshwari P/India/IBM
Abubakar Nadaf2/India/IBM
S Naqueba Sultana/India/IBM
Priya Murugesan/India/Contr/IBM
Sakshi Priya1/India/IBM
Bharath D3/India/IBM
S Naqueba Sultana/India/IBM
Sakshi Priya1/India/IBM
Amruta Sunil Pujari/India/IBM
Leon J Basco1/India/IBM
Shruthi Abhinav/India/Contr/IBM
Mohammad K Babu/India/Contr/IBM
Nurul H/India/Contr/IBM
Sameer Ahammed/India/IBM
Naveen Palaparthi/India/Contr/IBM
Anjali Kumari2/India/Contr/IBM
Anjali Kumari2/India/Contr/IBM
Anjali Kumari2/India/Contr/IBM
Anjali Kumari2/India/Contr/IBM
Abubakar Nadaf2/India/IBM
Shashwathi Nachappa/India/IBM
S Naqueba Sultana/India/IBM
Priya Murugesan/India/Contr/IBM
Sakshi Priya1/India/IBM
Amruta Sunil Pujari/India/IBM
Bharath D3/India/IBM
Shashwathi Nachappa/India/IBM
S Naqueba Sultana/India/IBM
Sakshi Priya1/India/IBM
Amruta Sunil Pujari/India/IBM
Leon J Basco1/India/IBM
Shruthi Abhinav/India/Contr/IBM
Nitika Kumari/India/Contr/IBM
Mohammad K Babu/India/Contr/IBM
Nurul H/India/Contr/IBM
Sameer Ahammed/India/IBM
Naveen Palaparthi/India/Contr/IBM
Rajeshwari P/India/IBM
Chirayu Bapat/India/IBM
Chirayu Bapat/India/IBM
Bharath D3/India/IBM
Shashwathi Nachappa/India/IBM
S Naqueba Sultana/India/IBM
Priya Murugesan/India/Contr/IBM
Amruta Sunil Pujari/India/IBM
Rajeshwari P/India/IBM
Bharath D3/India/IBM
Dhanapati Naorem/India/IBM
Dhanapati Naorem/India/IBM
Dhanapati Naorem/India/IBM
Dhanapati Naorem/India/IBM
Dhanapati Naorem/India/IBM
Dhanapati Naorem/India/IBM
Ramkumar Venugopal/India/IBM
Busupalli L Narayana/India/IBM
Busupalli L Narayana/India/IBM
Busupalli L Narayana/India/IBM
Busupalli L Narayana/India/IBM
Bharath D3/India/IBM
Bharath D3/India/IBM
Bharath D3/India/IBM
Busupalli L Narayana/India/IBM
Bharath D3/India/IBM
Bharath D3/India/IBM
Bharath D3/India/IBM
Bharath D3/India/IBM
Busupalli L Narayana/India/IBM
Kiran Shelka/India/Contr/IBM
Kiran Shelka/India/Contr/IBM
Kiran Shelka/India/Contr/IBM
Kiran Shelka/India/Contr/IBM
Kiran Shelka/India/Contr/IBM
Kiran Shelka/India/Contr/IBM
Kiran Shelka/India/Contr/IBM
Kiran Shelka/India/Contr/IBM
Kiran Shelka/India/Contr/IBM
Kiran Shelka/India/Contr/IBM
Kiran Shelka/India/Contr/IBM
Kiran Shelka/India/Contr/IBM
Kiran Shelka/India/Contr/IBM
Kiran Shelka/India/Contr/IBM
Kiran Shelka/India/Contr/IBM
Kiran Shelka/India/Contr/IBM
Kiran Shelka/India/Contr/IBM
Shikha Jadon/India/Contr/IBM
Shikha Jadon/India/Contr/IBM
Shikha Jadon/India/Contr/IBM
Shikha Jadon/India/Contr/IBM
Shikha Jadon/India/Contr/IBM
Shikha Jadon/India/Contr/IBM
Shikha Jadon/India/Contr/IBM
Shikha Jadon/India/Contr/IBM
Shikha Jadon/India/Contr/IBM
Shikha Jadon/India/Contr/IBM
Shikha Jadon/India/Contr/IBM
Shikha Jadon/India/Contr/IBM
Shikha Jadon/India/Contr/IBM
Shikha Jadon/India/Contr/IBM
Shikha Jadon/India/Contr/IBM
Shikha Jadon/India/Contr/IBM
Shikha Jadon/India/Contr/IBM
Shikha Jadon/India/Contr/IBM
Shikha Jadon/India/Contr/IBM
Shikha Jadon/India/Contr/IBM
Shikha Jadon/India/Contr/IBM
Shikha Jadon/India/Contr/IBM
Shikha Jadon/India/Contr/IBM
Shikha Jadon/India/Contr/IBM
Chirayu Bapat/India/IBM
Chirayu Bapat/India/IBM
Amrutha M S/India/IBM
Amrutha M S/India/IBM
Amrutha M S/India/IBM
Amrutha M S/India/IBM
Amrutha M S/India/IBM
Amrutha M S/India/IBM
Amrutha M S/India/IBM
Amrutha M S/India/IBM
Amrutha M S/India/IBM
Amrutha M S/India/IBM
Amrutha M S/India/IBM
Amrutha M S/India/IBM
Amrutha M S/India/IBM
Amrutha M S/India/IBM
Amrutha M S/India/IBM
Amrutha M S/India/IBM
Amrutha M S/India/IBM
Chirayu Bapat/India/IBM
Chirayu Bapat/India/IBM
Chirayu Bapat/India/IBM
Chirayu Bapat/India/IBM
Amrutha M S/India/IBM
Amrutha M S/India/IBM
Amrutha M S/India/IBM
Amrutha M S/India/IBM
Amrutha M S/India/IBM
Amrutha M S/India/IBM
Amrutha M S/India/IBM
Amrutha M S/India/IBM
Amrutha M S/India/IBM
Amrutha M S/India/IBM
Amrutha M S/India/IBM
Amrutha M S/India/IBM
Amrutha M S/India/IBM
Amrutha M S/India/IBM
Amrutha M S/India/IBM
Amrutha M S/India/IBM
Amrutha M S/India/IBM
Amrutha M S/India/IBM
Amrutha M S/India/IBM
Amrutha M S/India/IBM
Amrutha M S/India/IBM
Amrutha M S/India/IBM
Amrutha M S/India/IBM
Amrutha M S/India/IBM
Rajeshwari P/India/IBM
Chirayu Bapat/India/IBM
Chirayu Bapat/India/IBM
Chirayu Bapat/India/IBM
Chirayu Bapat/India/IBM
Chirayu Bapat/India/IBM
Chirayu Bapat/India/IBM
Rajeshwari P/India/IBM
Rajeshwari P/India/IBM
Rajeshwari P/India/IBM
Rajeshwari P/India/IBM
Rajeshwari P/India/IBM
Rajeshwari P/India/IBM
Rajeshwari P/India/IBM
Rajeshwari P/India/IBM
Rajeshwari P/India/IBM
Rajeshwari P/India/IBM
Rajeshwari P/India/IBM
Rajeshwari P/India/IBM
Rajeshwari P/India/IBM
Rajeshwari P/India/IBM
Rajeshwari P/India/IBM
Rajeshwari P/India/IBM
Rajeshwari P/India/IBM
Rajeshwari P/India/IBM
Rajeshwari P/India/IBM
Rajeshwari P/India/IBM
Rajeshwari P/India/IBM
Rajeshwari P/India/IBM
Rajeshwari P/India/IBM
Rajeshwari P/India/IBM
Chirayu Bapat/India/IBM
Rajeshwari P/India/IBM
Chirayu Bapat/India/IBM
Rajeshwari P/India/IBM
Rajeshwari P/India/IBM
Rajeshwari P/India/IBM
Rajeshwari P/India/IBM
Rajeshwari P/India/IBM
Rajeshwari P/India/IBM
Rajeshwari P/India/IBM
Rajeshwari P/India/IBM
Rajeshwari P/India/IBM
S Naqueba Sultana/India/IBM
Priya Murugesan/India/Contr/IBM
Sameer Ahammed/India/IBM
Sameeshra Salunke/India/IBM
Shashwathi Nachappa/India/IBM
S Naqueba Sultana/India/IBM
Priya Murugesan/India/Contr/IBM
Sameer Ahammed/India/IBM
Shashwathi Nachappa/India/IBM
Shruthi Abhinav/India/Contr/IBM
S Naqueba Sultana/India/IBM
Priya Murugesan/India/Contr/IBM
Shashwathi Nachappa/India/IBM
Shruthi Abhinav/India/Contr/IBM
Priya Murugesan/India/Contr/IBM
Sameer Ahammed/India/IBM
Sameeshra Salunke/India/IBM
Shashwathi Nachappa/India/IBM
Shruthi Abhinav/India/Contr/IBM
S Naqueba Sultana/India/IBM
Priya Murugesan/India/Contr/IBM
Sameer Ahammed/India/IBM
Sameeshra Salunke/India/IBM
Shashwathi Nachappa/India/IBM
Shruthi Abhinav/India/Contr/IBM
Sameer Ahammed/India/IBM
Shashwathi Nachappa/India/IBM
Shruthi Abhinav/India/Contr/IBM
Priya Murugesan/India/Contr/IBM
Sameer Ahammed/India/IBM
Sameeshra Salunke/India/IBM
Shashwathi Nachappa/India/IBM
Priya Murugesan/India/Contr/IBM
Sameer Ahammed/India/IBM
Sameeshra Salunke/India/IBM
S Naqueba Sultana/India/IBM
Sameer Ahammed/India/IBM
Sameeshra Salunke/India/IBM
Shashwathi Nachappa/India/IBM
Shruthi Abhinav/India/Contr/IBM
S Naqueba Sultana/India/IBM
Priya Murugesan/India/Contr/IBM
Sameer Ahammed/India/IBM
Sameeshra Salunke/India/IBM
Shashwathi Nachappa/India/IBM
S Naqueba Sultana/India/IBM
Priya Murugesan/India/Contr/IBM
Sameer Ahammed/India/IBM
Shashwathi Nachappa/India/IBM
Shruthi Abhinav/India/Contr/IBM
S Naqueba Sultana/India/IBM
Shashwathi Nachappa/India/IBM
Shruthi Abhinav/India/Contr/IBM
S Naqueba Sultana/India/IBM
Sameer Ahammed/India/IBM
Sameeshra Salunke/India/IBM
Shashwathi Nachappa/India/IBM
Shruthi Abhinav/India/Contr/IBM
S Naqueba Sultana/India/IBM
Priya Murugesan/India/Contr/IBM
Sameer Ahammed/India/IBM
Sameeshra Salunke/India/IBM
Shashwathi Nachappa/India/IBM
Shruthi Abhinav/India/Contr/IBM
S Naqueba Sultana/India/IBM
Sameer Ahammed/India/IBM
Sameeshra Salunke/India/IBM
Shruthi Abhinav/India/Contr/IBM
S Naqueba Sultana/India/IBM
Priya Murugesan/India/Contr/IBM
Sameeshra Salunke/India/IBM
Shashwathi Nachappa/India/IBM
Shruthi Abhinav/India/Contr/IBM
S Naqueba Sultana/India/IBM
Priya Murugesan/India/Contr/IBM
Sameer Ahammed/India/IBM
Sameeshra Salunke/India/IBM
S Naqueba Sultana/India/IBM
Sameeshra Salunke/India/IBM
S Naqueba Sultana/India/IBM
Sameeshra Salunke/India/IBM
Shashwathi Nachappa/India/IBM
Shruthi Abhinav/India/Contr/IBM
Priya Murugesan/India/Contr/IBM
Sameer Ahammed/India/IBM
Sameeshra Salunke/India/IBM
Shruthi Abhinav/India/Contr/IBM
S Naqueba Sultana/India/IBM
Nitika Kumari/India/Contr/IBM
Simron Lepcha/India/Contr/IBM
Shivani Shrivastava1/India/Contr/IBM
Shivani Shrivastava1/India/Contr/IBM
Shivani Shrivastava1/India/Contr/IBM
Shivani Shrivastava1/India/Contr/IBM
Priya Murugesan/India/Contr/IBM
S Naqueba Sultana/India/IBM
Shashwathi Nachappa/India/IBM
S Naqueba Sultana/India/IBM
Priya Murugesan/India/Contr/IBM
Sameeshra Salunke/India/IBM
Shashwathi Nachappa/India/IBM
Shruthi Abhinav/India/Contr/IBM
S Naqueba Sultana/India/IBM
Priya Murugesan/India/Contr/IBM
Chirayu Bapat/India/IBM
Sameer Ahammed/India/IBM
Sameeshra Salunke/India/IBM
Shashwathi Nachappa/India/IBM
Shruthi Abhinav/India/Contr/IBM
S Naqueba Sultana/India/IBM
Priya Murugesan/India/Contr/IBM
Sameer Ahammed/India/IBM
Sameeshra Salunke/India/IBM
Shashwathi Nachappa/India/IBM
S Naqueba Sultana/India/IBM
Priya Murugesan/India/Contr/IBM
Shashwathi Nachappa/India/IBM
Priya Murugesan/India/Contr/IBM
Sameer Ahammed/India/IBM
Shruthi Abhinav/India/Contr/IBM
Priya Murugesan/India/Contr/IBM
Sameer Ahammed/India/IBM
Shashwathi Nachappa/India/IBM
Shruthi Abhinav/India/Contr/IBM
Shashwathi Nachappa/India/IBM
S Naqueba Sultana/India/IBM
Priya Murugesan/India/Contr/IBM
Sameer Ahammed/India/IBM
Shashwathi Nachappa/India/IBM
Shruthi Abhinav/India/Contr/IBM
Priya Murugesan/India/Contr/IBM
Sameeshra Salunke/India/IBM
Shashwathi Nachappa/India/IBM
Shruthi Abhinav/India/Contr/IBM
Priya Murugesan/India/Contr/IBM
Sameer Ahammed/India/IBM
Syed Asif1/India/Contr/IBM
Syed Asif1/India/Contr/IBM
Syed Asif1/India/Contr/IBM
Syed Asif1/India/Contr/IBM
Syed Asif1/India/Contr/IBM
S Naqueba Sultana/India/IBM
Priya Murugesan/India/Contr/IBM
Sameeshra Salunke/India/IBM
Priya Murugesan/India/Contr/IBM
Sameeshra Salunke/India/IBM
Shruthi Abhinav/India/Contr/IBM
S Naqueba Sultana/India/IBM
Priya Murugesan/India/Contr/IBM
Sameeshra Salunke/India/IBM
Shruthi Abhinav/India/Contr/IBM
S Naqueba Sultana/India/IBM
Priya Murugesan/India/Contr/IBM
Sameer Ahammed/India/IBM
Sameeshra Salunke/India/IBM
Shashwathi Nachappa/India/IBM
Shruthi Abhinav/India/Contr/IBM
S Naqueba Sultana/India/IBM
Priya Murugesan/India/Contr/IBM
Sameer Ahammed/India/IBM
Sameeshra Salunke/India/IBM
Shashwathi Nachappa/India/IBM
S Naqueba Sultana/India/IBM
Priya Murugesan/India/Contr/IBM
Sameer Ahammed/India/IBM
Shashwathi Nachappa/India/IBM
Shruthi Abhinav/India/Contr/IBM
S Naqueba Sultana/India/IBM
Sameer Ahammed/India/IBM
Sameeshra Salunke/India/IBM
Shashwathi Nachappa/India/IBM
Shruthi Abhinav/India/Contr/IBM
S Naqueba Sultana/India/IBM
Sameeshra Salunke/India/IBM
Shashwathi Nachappa/India/IBM
Shruthi Abhinav/India/Contr/IBM
S Naqueba Sultana/India/IBM
Priya Murugesan/India/Contr/IBM
S Naqueba Sultana/India/IBM
Sameer Ahammed/India/IBM
Sameeshra Salunke/India/IBM
Shruthi Abhinav/India/Contr/IBM
Shruthi Abhinav/India/Contr/IBM
S Naqueba Sultana/India/IBM
Priya Murugesan/India/Contr/IBM
Sameer Ahammed/India/IBM
Shashwathi Nachappa/India/IBM
Shruthi Abhinav/India/Contr/IBM
S Naqueba Sultana/India/IBM
Priya Murugesan/India/Contr/IBM
Sameer Ahammed/India/IBM
Sameeshra Salunke/India/IBM
Shashwathi Nachappa/India/IBM
Shruthi Abhinav/India/Contr/IBM
Rajeshwari P/India/IBM
Shruthi Abhinav/India/Contr/IBM
Shruthi Abhinav/India/Contr/IBM
Amruta Sunil Pujari/India/IBM
Amruta Sunil Pujari/India/IBM
Rajeshwari P/India/IBM
Sameer Ahammed/India/IBM
Amruta Sunil Pujari/India/IBM
Priya Murugesan/India/Contr/IBM
Priya Murugesan/India/Contr/IBM
Priya Murugesan/India/Contr/IBM
Priya Murugesan/India/Contr/IBM
Priya Murugesan/India/Contr/IBM
Sameer Ahammed/India/IBM
Sameer Ahammed/India/IBM
Priya Murugesan/India/Contr/IBM
Priya Murugesan/India/Contr/IBM
Priya Murugesan/India/Contr/IBM
Priya Murugesan/India/Contr/IBM
Priya Murugesan/India/Contr/IBM
Priya Murugesan/India/Contr/IBM
Amruta Sunil Pujari/India/IBM
Amruta Sunil Pujari/India/IBM
Sakshi Singh6/India/IBM
Anjali Kumari2/India/Contr/IBM
Anjali Kumari2/India/Contr/IBM
Anjali Kumari2/India/Contr/IBM
Anjali Kumari2/India/Contr/IBM
Anjali Kumari2/India/Contr/IBM
Kumaravel Subbarayan/India/IBM
Kumaravel Subbarayan/India/IBM
Kumaravel Subbarayan/India/IBM
Kumaravel Subbarayan/India/IBM
Kumaravel Subbarayan/India/IBM
Kumaravel Subbarayan/India/IBM
Kumaravel Subbarayan/India/IBM
Kumaravel Subbarayan/India/IBM
Kumaravel Subbarayan/India/IBM
Nidhi Verma1/India/IBM
Nidhi Verma1/India/IBM
Nidhi Verma1/India/IBM
Nidhi Verma1/India/IBM
Bharath D3/India/IBM
Priya Murugesan/India/Contr/IBM
Priya Murugesan/India/Contr/IBM
Priya Murugesan/India/Contr/IBM
Priya Murugesan/India/Contr/IBM
Sameer Ahammed/India/IBM
Sameer Ahammed/India/IBM
Sameer Ahammed/India/IBM
Priya Murugesan/India/Contr/IBM
Priya Murugesan/India/Contr/IBM
Priya Murugesan/India/Contr/IBM
Priya Murugesan/India/Contr/IBM
Priya Murugesan/India/Contr/IBM
Priya Murugesan/India/Contr/IBM
Priya Murugesan/India/Contr/IBM
Rajeshwari P/India/IBM
Anjali Kumari2/India/Contr/IBM
Anjali Kumari2/India/Contr/IBM
Anjali Kumari2/India/Contr/IBM
Anjali Kumari2/India/Contr/IBM
Anjali Kumari2/India/Contr/IBM
Kumaravel Subbarayan/India/IBM
Kumaravel Subbarayan/India/IBM
Kumaravel Subbarayan/India/IBM
Kumaravel Subbarayan/India/IBM
Kumaravel Subbarayan/India/IBM
Kumaravel Subbarayan/India/IBM
Kumaravel Subbarayan/India/IBM
Kumaravel Subbarayan/India/IBM
Kumaravel Subbarayan/India/IBM
Kumaravel Subbarayan/India/IBM
Nidhi Verma1/India/IBM
Nidhi Verma1/India/IBM
Nidhi Verma1/India/IBM
Nidhi Verma1/India/IBM
Nidhi Verma1/India/IBM
Kumaravel Subbarayan/India/IBM
Vinaya Mandlik01/India/IBM
Vinaya Mandlik01/India/IBM
Vinaya Mandlik01/India/IBM
Vinaya Mandlik01/India/IBM
Vinaya Mandlik01/India/IBM
Vinaya Mandlik01/India/IBM
Vinaya Mandlik01/India/IBM
Vinaya Mandlik01/India/IBM
Vinaya Mandlik01/India/IBM
Nidhi Verma1/India/IBM
Nidhi Verma1/India/IBM
Kumaravel Subbarayan/India/IBM
Vinaya Mandlik01/India/IBM
Vinaya Mandlik01/India/IBM
Vinaya Mandlik01/India/IBM
Vinaya Mandlik01/India/IBM
Vinaya Mandlik01/India/IBM
Vinaya Mandlik01/India/IBM
Vinaya Mandlik01/India/IBM
Vinaya Mandlik01/India/IBM
Vinaya Mandlik01/India/IBM
Vinaya Mandlik01/India/IBM
Vinaya Mandlik01/India/IBM
Vinaya Mandlik01/India/IBM
Vinaya Mandlik01/India/IBM
Vinaya Mandlik01/India/IBM
Nidhi Verma1/India/IBM
Kumaravel Subbarayan/India/IBM
Kumaravel Subbarayan/India/IBM
Kumaravel Subbarayan/India/IBM
Kumaravel Subbarayan/India/IBM
Kumaravel Subbarayan/India/IBM
Kumaravel Subbarayan/India/IBM
Kumaravel Subbarayan/India/IBM
Kumaravel Subbarayan/India/IBM
Kumaravel Subbarayan/India/IBM
Vinaya Mandlik01/India/IBM
Vinaya Mandlik01/India/IBM
Jaskaran Singh42/India/IBM
Vinaya Mandlik01/India/IBM
Vinaya Mandlik01/India/IBM
Kumaravel Subbarayan/India/IBM
Kumaravel Subbarayan/India/IBM
Kumaravel Subbarayan/India/IBM
Vinaya Mandlik01/India/IBM
Vinaya Mandlik01/India/IBM
Kumaravel Subbarayan/India/IBM
Kumaravel Subbarayan/India/IBM
Kumaravel Subbarayan/India/IBM
Kumaravel Subbarayan/India/IBM
Kumaravel Subbarayan/India/IBM
Kumaravel Subbarayan/India/IBM
Kumaravel Subbarayan/India/IBM
Kumaravel Subbarayan/India/IBM
Kumaravel Subbarayan/India/IBM
Kumaravel Subbarayan/India/IBM
Kumaravel Subbarayan/India/IBM
Kumaravel Subbarayan/India/IBM
Vinaya Mandlik01/India/IBM
Vinaya Mandlik01/India/IBM
Jaskaran Singh42/India/IBM
Vinaya Mandlik01/India/IBM
Vinaya Mandlik01/India/IBM
Vinaya Mandlik01/India/IBM
Vinaya Mandlik01/India/IBM
Vinaya Mandlik01/India/IBM
Vinaya Mandlik01/India/IBM
Vinaya Mandlik01/India/IBM
Soni Deshmukh1/India/IBM
Chirayu Bapat/India/IBM
Sameer Ahammed/India/IBM
Rajeshwari P/India/IBM
Sinimol Koshy/India/IBM
Rajdeep Nag/India/IBM
Bharath D3/India/IBM
Santosh kumar Chitta2/India/IBM
Priya Murugesan/India/Contr/IBM
Sakshi Priya1/India/IBM
Reema Pk11/India/IBM