
Digital Energy 2008, Houston TW0808

SPE GCS1 Digital Energy


Houston, May 2008

Don Paul (Chevron), Washington Salles (Petrobras), Don Moore (OXY) and Steve Fortune (BP)

The Society of Petroleum Engineers' Houston Digital Oilfield conference and exhibition is an altogether
lower-key affair compared to its European 'Intelligent Energy' counterpart – reflecting its origins with the
SPE's local Gulf Coast chapter.
Previous SPE-supported events have often touted the idea that the oil industry is a technology ‘laggard.’ Not
so according to Don Paul (Chevron) who cited a recent study by the US Council on Competitiveness2 that
found that Tier 1 US energy companies outpaced other sectors. Oil and gas has embedded IT ‘all the way to
the front line of the business.’ Paul also analyzed the current supply and demand situation to conclude that
demand growth is likely to continue apace – and that for the industry, ‘this is as good as it gets.’ The
technology lead theme was largely supported in Donna Crawford’s (LLNL3) keynote on high performance
computing.
A couple of interesting panel sessions debated the state of the art. Tony Edwards (BG) saw 'confusion' as to
what should be standardized, outsourced and automated. Steve Fortune (BP) ventured that information
management was more important than the digital oilfield. An interesting observation in so far as others, Don
Paul included, considered the data management problem as more or less intractable. Fortune also noted that
the digital oilfield was moving from pilots into ‘full scale, value generating deployments.’
Otherwise the papers presented showed considerable diversity around the digital oilfield theme. Topics
included Shell’s Gulf of Mexico production operations management center, AspenTech’s work on
optimizing production on BP’s Azeri field, Shell’s gas lift optimization on Brunei’s Champion field and
rock mechanics modeling from Chevron. An interesting gathering of smaller vendors was showing software
for workflow management, collaboration and visualization.
Many exhibitors were showing off Chevron deployments. We’re not sure if this means that Chevron is
ahead in the digital oilfield stakes or just more communicative. Either way, Chevron’s openness is to be
encouraged.

Highlights

Keynote – Don Paul, Chevron


Global IHS data available in Google Earth
Shell’s GOM Production Operations Center
State of the art HPC – Lawrence Livermore
Schlumberger’s ‘Blue’ digital oilfield
Optimizing BP’s Azeri field
Gas lift on Shell Brunei Champion field
CIO Panel discussion

1 Gulf Coast Section.
2 www.compete.org.
3 Lawrence Livermore National Laboratory.

Technology Watch Report 1 © 2008 The Data Room



Contents

TW0808_1 Keynote – Don Paul (Chevron).................................................................................................................. 3


TW0808_2 Next Generation Production Surveillance - Tom Moroney, Shell ............................................................. 4
TW0808_3 Panel Session ............................................................................................................................................. 4
TW0808_4 Workflows in digital oilfields – Anil Pande, Infosys................................................................................ 4
TW0808_5 Energy web services – James Sanders, IHS.............................................................................................. 5
TW0808_6 Integrated Information Framework – David Haake, IBM.......................................................................... 5
TW0808_7 Keynote – High performance computing – Donna Crawford, LLNL ........................................................ 6
TW0808_8 Keynote - John Gibson, Paradigm ............................................................................................................. 6
TW0808_9 The ‘blue’ digital oilfield – Ashok Belani, Schlumberger ......................................................................... 6
TW0808_10 BP’s Azeri Field Optimizer – Sergi Sama, AspenTech ........................................................................ 7
TW0808_11 Shell Brunei Champion field gas lift optimization – Ron Cramer, Shell Global Solutions .................. 7
TW0808_12 IT Innovations Panel Session ................................................................................................................ 8
0808_12.1 Rick Nicholson, Energy Insights.................................................................................................. 8
0808_12.2 Mike Sternskey, Microsoft ........................................................................................................... 8
0808_12.3 David Shimbo, Oracle ................................................................................................................. 8
0808_12.4 Katya Casey, BHPB..................................................................................................................... 8
TW0808_13 GIS based HSE portal – Vineet Lasrado, Infosys ................................................................................. 8
TW0808_14 Rock mechanics modeling – Peter Conolly, Chevron........................................................................... 8
TW0808_15 Panel discussion .................................................................................................................................... 9
TW0808_16 Exhibitors............................................................................................................................................ 11
0808_16.1 CISCO ‘NERV’ Network Emergency Response Vehicle............................................................ 11
0808_16.2 Software Innovation - Collaboration for OOs and EPCs (Chevron)......................................... 12
0808_16.3 Credant Technologies – Mobile Guardian for securing sensitive data ..................................... 12
0808_16.4 EPSIS Real Time Assistant (ERA) in Chevron’s Master Schedule View ................................... 13
0808_16.5 Halliburton – AssetObserver, AssetConnect modeling and operating environment ................. 14
0808_16.6 HP – Advanced Virtual Collaboration for Oil and Gas ‘open standard’.................................. 15
0808_16.7 IMPAC Systems search and knowledge management placemat................................................ 15
0808_16.8 Infonic – Geo-Replicator – enterprise scale replication for Microsoft SharePoint................... 15
0808_16.9 IOCOM – (InSors) Grid collaboration room software (Chevron) ............................................ 16
0808_16.10 Optelligent Solutions – oilfield data mining and visualization.................................................. 16
0808_16.11 P2 Energy Solutions – Enterprise Land .................................................................................... 17
0808_16.12 Schlumberger – Credentus enterprise credential manager ....................................................... 17
0808_16.13 QuickWells – Smart well design and procurement system ........................................................ 17
TW0808_17 The Data Room – Technology Watch subscription information......................................................... 18

The Data Room – Technology Watch subscription information


This report has been produced as part of The Data Room’s Technology Watch reporting service. For more
on this subscription-based service please visit the Technology Watch home page or email tw@oilit.com.

© June 2008
The Data Room
7 rue des Verrieres
F-92310 Sevres France
Tel (USA) 281 968 0752
Tel (UK) 020 7193 1489
Tel (France) +33 1 4623 9596
Fax +33 1 4623 0652
Technology Watch Home Page info@oilit.com


TW0808_1 Keynote – Don Paul (Chevron)

Paul
Many outside the industry don't understand that oil and gas is one of the most 'digitally intensive' industries as
a recent study by the US Council on Competitiveness shows. Upstream technology is pulled by business
opportunities such as the deepwater. The industry has been successful because it never got confused -
business drives technology rather than the other way round. The report studied technical computing in the
automobile, aerospace, pharmaceutical and oil and gas industries finding that Tier 1 US energy companies
outpaced other sectors. The big difference is that oil and gas has embedded IT ‘all the way to the front line
of the business.’ In other words, opportunity pulls technology. The study can be downloaded from
www.compete.org/publications.
Paul then turned to the oil price (currently around $135 per barrel) to ask ‘What’s happened to supply and
demand?’ Demand (consumption) is forecast to increase 50-60% by 2030. A 30% increase equates to the
equivalent of Saudi Arabia plus Russia. Oil, gas and coal will dominate as far as the eye can see although,
‘We need to do as much alternatives as possible ...’ While we are not running out of molecules (of oil),
supply is constrained by politics and costs. ‘Conventional’ may not be able to deliver. According to the
IEA, some $20 trillion in energy capital expenditure is required over the next 25 years. This is significantly
higher than current rates which are clearly insufficient to meet the above needs. Total energy demand
forecast supports the above 35% growth even though the economy may slow down. Demand is bolstered by
the shift in OECD to non OECD countries – with a cross over happening around now. Today growth is
outside of the OECD – the whole demand train is being pulled by new actors. Paul cited the Tata Motors
‘Nano’ low cost auto, expected to be produced in hundreds of millions. All gasoline powered – which will
have a major impact on consumption.
Both conventional and non conventional resources rely on ‘digital energy.’ The biggest resource in the
world is arguably the Athabasca tar sands. These need better designed processes and systems. As computers
get faster new applications come along. Not long ago subsalt imaging was considered impossible. It is
feasible now thanks to ‘smart people plus computers that are 1,000 times faster.’
Paul stressed that for the industry, ‘this is as good as it gets.’ An FPSO can be viewed as the most complex
mobile manufacturing facility ever. This can’t be designed with pencil and paper or even CAD/CAM – it
needs high performance computing (HPC). The Tengiz sour gas injection plant was needed so as not to
produce another cubic kilometer of sulfur and is equally complex. CERA estimated development costs at
$20/bbl. There was a time when this would have been considered outrageous. The digital oilfield is also
about improving decisions with decision support centers, operations centers and collaboration centers. Other
industries (defense, intelligence) have similarly complex integration issues. Chevron's 100 year old Kern
River field in California has been completely revived by technology. Today there are 6,000 trucks moving
around the field. Chevron uses ‘integrated’ 4D movies for real time planning (work from USC). This has
been used to mitigate risks, spills and safety incidents.
Paul sees current trends centered on connectivity, computing, engineering and manufacturing, human-digital
relationships, virtual worlds, automation and robotics (well controls). According to Ray Kurzweil, today
$1,000 buys a gigaflop of compute power while the fastest machines approach the petaflop. In ten years, we
will have a petaflop for $1,000. Chevron had 5,000 terabytes of storage in 2007 – growing at an annual rate of
80% for technical and 60% for business data. ‘Intensity’ is rising constantly and requires advances in data
management and access technology. SCADA systems were not built with security in mind! An issue that is
being addressed by the US Department of Homeland Security's 'Linking the Oil and Gas Industry to Improve
Cyber Security’ (LOGIIC – www.logiic.org) program. Chevron, like the US Army, has a Second Life
simulation game to educate about energy.
Q&A
Shell – could you elaborate on digital technology and safety?
Fields such as predictive spatial intelligence can be used to monitor when things are going out of
spec. More measurement and integration produced better predictive models that allow us to
anticipate ‘bad stuff.’ Here refining is further along the curve.
OpenSpirit – What about the McKinsey study that portrayed the industry as reluctant to use new
technology?
I don’t buy the story. I’ve been a contrarian on this for a long time. We are not talking about
consumer software. Bad technology in process can be catastrophic. So we implement scalable and
sure technology. In view of the consequence of failure, the time frame for take-up is about what you
would expect. We don’t use latest and greatest stuff and fix it when it breaks: The time frame may
be 15 years yes, but this is a hundred year business. We experiment fast but don’t implement till it
works.
What is the impact of the ‘younging’ of the workforce?
This is the ‘bridge dilemma.’ The young have fantastic digital skills but no experience. Older folks
have the experience but don’t use an iPod! But there is no substitute for experience. Simulations
help people realize the consequences of bad decisions. If you blow up the plant in the simulator you
learn something. This may even be better than hanging out in Angola for a couple of years.
TW0808_2 Next Generation Production Surveillance - Tom Moroney, Shell
Shell’s deepwater Gulf of Mexico (GOM) asset portfolio is growing in size and complexity. More data has
to be processed routinely and efficiently. Much of the workforce is close to retirement and Shell is
mentoring a wave of young talent that is coming into the organization. One facet of this is ‘exception based’
surveillance workflows and tools feeding into a knowledge repository. This is allowing Shell to ‘routinize’
elements of its activity so as to avoid ‘firefighting.’ Shell’s old surveillance system in New Orleans was not
considered good enough. A new system was needed to offer centralized, consistent cross-asset surveillance
for the GOM and maybe Brazil and Alaska. The key to the new Production Operations Management Center
(POMC) is that it is ‘exception based,’ providing operating limits, drawdowns, performance curves for
topsides and subsea equipment. The idea is to understand conditions as they occur. The appropriate time
frame may be days, weeks or months. A virtual asset team can be constituted from POMC staff, operations
and process engineers which can apply ‘lean’ principles and consistent definitions across wells, facilities and
subsea. Technology in the POMC includes a portal, advanced alarm tool, workflow automation, dynamic
reporting and the knowledge repository. Alarm notifications pop up PI Historian trends and well test data for
analysis and ‘situational awareness’ – all managed in the workflow engine. Once the analysis is done, this is
handed off to engineering. After an intervention, the system captures new parameters and performs short
cycle optimization before issuing recommendations to operations. Upon which the surveillance cycle is
complete, captured and documented for later reference.
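The 'exception based' idea – comparing live readings against per-equipment operating envelopes and surfacing only the outliers – can be sketched in a few lines. This is a minimal illustration, not Shell's implementation; the tag names and limits are invented:

```python
from dataclasses import dataclass

@dataclass
class OperatingLimit:
    tag: str        # a historian tag for a well or topsides measurement (hypothetical)
    low: float
    high: float

def exceptions(limits, readings):
    """Return (tag, value) pairs whose latest reading falls outside its envelope.

    In an exception-based scheme, only these trigger alarms and workflow steps;
    everything in-envelope stays quiet.
    """
    out = []
    for lim in limits:
        value = readings.get(lim.tag)
        if value is not None and not (lim.low <= value <= lim.high):
            out.append((lim.tag, value))
    return out
```

A drawdown breach, say, would pop the alarm into the workflow engine for analysis rather than relying on an engineer spotting it in a trend plot.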
TW0808_3 Panel Session
Tony Edwards (BG) attended the EAGE workshop on Integrated Operations in Stresa last year and observed
that there was confusion as to what should be standardized, outsourced and automated. Edwards believes
standards should be limited to HSSE, integrity, procurement etc., but that well placement and other
‘creative’ processes should be approached differently. If you put too much in the standardized box you can
inhibit creativity. Similar issues arise with rigid process workflows. There is an approach that says, ‘don’t
standardize but go for a continuous improvement approach, start with the status quo and improve.’ Some
think the process never stops. Another aspect is that standardizing on how, rather than on what, may inhibit
innovation. These issues are particularly important if you have a diverse portfolio. One size will not fit all!
ExxonMobil – There is excitement about IT enablement and removing the mundane stuff so that people do
what they were trained to do. Exxon does internal videos for presentations to management. Everyone says
the new IT is wonderful. Exxon also advocates continuous improvement of its workflows and best practices.
TW0808_4 Workflows in digital oilfields – Anil Pande, Infosys
This talk concerns office automation rather than real time/process control workflows. Such workflows might
concern a workover, an AFE, a travel request etc. A workflow could be a Perl script for a particular task or
something more complex. Workflow is the glue that holds all these together. Standardization is ‘desirable,’
allowing for standard processes that can be integrated with other workflows. There are no standard tools –
Word, Excel and PowerPoint can all be used to map out a workflow. But there is ‘no best practice.’
Workflow modeling building blocks include participant, activity, transition and swim-lane.
Q&A
Shell – Are there any standards for workflow?

We work with some products but these may not integrate well.4
TW0808_5 Energy web services – James Sanders, IHS
With growing data volumes, companies have to ask, ‘Do you want it all in your data center?’ Web services
have matured into a key enabling technology for remote data access. Sanders enumerated ‘emerging’
solution patterns as follows …
1) Application project data management. Here an application has a big ‘get data button’. This
approach is suited to applications that rely on a local database and where there is traditionally a lot
of manual work building projects. Petra Direct Connect, Geographix and PowerTools now all
support direct data access.
2) Corporate data store manager – a central database a.k.a. ‘single source of truth.’ Merrick
Systems’ RIO for example automates data load from the web.
3) E&P portal integration – here both internal and external data sources are aggregated in a ‘one
stop shop.’ Pioneer and Schlumberger customized a DecisionPoint portal with data coming from
multiple web services.
4) Online map and globe visualization. One large E&P company has 1,000 instances of Google
Earth and 200 Google Earth Pro licenses. The technology is used in BP’s IDV hurricane
management system and CartaSite for digital oilfield asset management. CartaSite uses video
cameras on facilities and trucks so that operations can react to events, contacting the truck or facility
over VoIP.
Global E&P content is now available in Google Earth – from IHS. Content delivery via IHS web services is
up from 2 to 27 million records per month over the last year. This is real!
Q&A
Oil IT Journal – Do you provide a web services API?
Yes that’s what I sell to developers.
Is this used by smaller companies?
Some companies use web services for 100% of data. About 110 companies are using this.
Chevron – Where are you in the REST vs Web Services debate?
We are in the Web Services SOAP RPC camp. We see these as a follow on from DCOM/CORBA.
REST is more appropriate for web documents.
You mean WS-I - RPC is deprecated.
Yes...
Do you use UDDI?
No! It was just on the slide!
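The REST vs SOAP RPC distinction debated above can be made concrete in a few lines – the endpoint, the 'GetWellRecord' operation and the parameters below are hypothetical, not IHS's actual API:

```python
from urllib.parse import urlencode
import xml.etree.ElementTree as ET

SOAP_NS = "{http://schemas.xmlsoap.org/soap/envelope/}"

def rest_request(base, well_id):
    """REST style: the resource is addressed by URL; HTTP verbs carry the intent."""
    return f"{base}/wells/{well_id}?" + urlencode({"format": "xml"})

def soap_request(well_id):
    """SOAP RPC style: operation name and arguments travel in an XML envelope
    POSTed to a single service endpoint."""
    env = ET.Element(SOAP_NS + "Envelope")
    body = ET.SubElement(env, SOAP_NS + "Body")
    op = ET.SubElement(body, "GetWellRecord")   # hypothetical operation name
    ET.SubElement(op, "WellId").text = well_id
    return ET.tostring(env, encoding="unicode")
```

In the REST view the well record is a document with its own URL; in the SOAP RPC view it is the return value of a remote procedure call – which is why Sanders sees SOAP as the descendant of DCOM/CORBA.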
TW0808_6 Integrated Information Framework – David Haake, IBM
Haake provided a lot more ‘business case’ for IBM’s Integrated Information Framework for process and
manufacturing than details on the technology. The IIF came out of the Norwegian Integrated Operations and
TAIL. The IIF includes a Rational (IBM’s modeling tool) Reference Semantic Model (RSM) connecting
measurements through planning, scheduling etc. This is ‘not a data model’ but models equipment
interoperating with existing standards including OPC, ISO 15926 and MIMOSA via an ‘enterprise namespace.’
Q&A
Chevron – What exactly is the technology platform? Proprietary Rational, OWL or what?
Rational is a product, the model is not proprietary.
Making decisions faster is all very well – but perhaps the aim should be to make the right decisions
faster. Is there any evidence that these systems enhance decision making?
Yes there is from Statoil, which has leveraged the downstream science of predictive maintenance
and applied it to the well.

4 It’s surprising that no mention was made of the Workflow Management Coalition - http://www.wfmc.org/ - in this
context or the Web Services Business Process Execution Language - http://docs.oasis-open.org/wsbpel/2.0/OS/wsbpel-v2.0-OS.pdf -
now supported by Microsoft’s Windows Workflow Foundation and BizTalk.

TW0808_7 Keynote – High performance computing – Donna Crawford, LLNL
The Lawrence Livermore National Laboratory (LLNL) bought its first computer in 1953, a machine capable of
1,000 flops. Today the IBM BlueGene/L high performance computer (HPC) is N° 1 in the TOP500 charts at
½ a petaflop. This is used on some daunting tasks such as assuring the US nuclear stockpile’s security
without underground testing. In the 1990s the LLNL decided it needed 100 teraflops – which required
‘breaking out of Moore’s Law’ before 50% of its test engineers retired in 2004. This led to a roadmap for
Sequoia petascale and exascale computing. Examples include the modeling of 64 million atoms of
molybdenum collapsing under pressure. Other codes model molten metal – a 2 billion atom calculation
running on 131,000 processors for 8 days. Incidentally application development costs a whole lot more than
hardware. A petaflop is expected at the next TOP500 this June, but despite the headlines, the real problem
today is harnessing the theoretical performance. You actually get 220 teraflops maximum on a 600 teraflop
machine. LLNL has worked in oil and gas. An Aramco fluid flow simulation on the Ghawar field ran for 8
hours on a 100 core machine. Models like the SEG/EAGE salt model fully elastic simulation and the SEG
SEAM model will require a teraflop for sub 20Hz modeling and a petaflop for 50Hz models. Even more
power is needed for inversion for risk assessment. LLNL is getting into the energy business, offering its
‘Stochastic Engine’ for oilfield data integration and Bayesian inversion for risk assessment. Petascale will
bring qualitative change in the science produced. A CD is available from the LLNL’s Industrial Partnerships
Office.5
Q&A
Halliburton – We heard earlier about demand-led technology in oil and gas but the government
seems to have an ‘if we build it they will come...’ attitude.
This is a sensitive issue. A Council on Competitiveness study found that there was not sufficient demand
pull. A lot of industries are not buying HPC. We need to demonstrate technology that works.
They’re not asking for it.
Reservoir engineering also has tough problems to solve – why differentiate?
These are often harder problems – we know and push seismic.
Don Paul (Chevron) – The LLNL is only 15 miles from our San Ramon location – they are a
fantastic group to work with.
TW0808_8 Keynote - John Gibson, Paradigm
Gibson offered an entertaining and wide ranging talk that included the anecdote of Canadian mineral
explorer GoldCorp, whose CEO transformed the company by putting its geological data on the web – following
the open source model for software development. This generated 110 new prospects for a modest $575,000
worth of prizes to amateur prospectors and turned GoldCorp’s market capitalization from $100 million to
$20 billion.6 A great example of transformational IT and a distributed ‘workforce.’ Oils often regard their
data as ‘proprietary’ – rather than in terms of its potential. Another web innovator is Eli Lilly, whose
www.innocentive.com site is used for problem solving in many industries – including oil and gas (mostly in
oil spill remediation).
TW0808_9 The ‘blue’ digital oilfield – Ashok Belani, Schlumberger
A decade ago, the industry was faced with issues such as compute power, connectivity and ‘hard to
integrate’ databases. Since then ‘IT has moved on - all of these are second nature.’7 Digital is a way of life.
Schlumberger now has a hundred or so operations support centers for drilling around the world. These leverage
experts, accelerate learning and provide well placement and drilling performance optimization.
Belani cited the new StimMap Live microseismic service for real time fracture propagation studies as an
example of a recent development. With the boom in deepwater exploration, WesternGeco’s Q-Marine’s rich
wide azimuth seismics is coming into its own. NVIDIA GPGPU also got a mention, as used for reverse time migration.
Multi measurement (seismic plus magnetotellurics plus gravity) is getting traction and is another target for HPC
– Belani would like to do simultaneous inversion across all measurements for sub salt/basalt targets.
Following more plugs for Petrel and AVOCET IAM (used on a miscible WAG for ConocoPhillips), Belani
turned to ‘field wide integration.’ Contradicting Don Paul (see above), Belani described the oilfield as not an
aggressive adopter of change. Experimentation and early adoption is not a characteristic of oil and gas.

5 https://ipo.llnl.gov/.
6 This led to the invention of Wikinomics - http://www.usatoday.com/money/books/reviews/2007-01-02-wikinomics_x.htm.
7 A case could be made for these problems remaining as major ‘issues’ today.

Schlumberger is to address this under a new ‘Blue Field’ banner. The Blue Field is a journey to continuous
improvement. Schlumberger’s differentiators are software, data management and a holistic approach to total
integration. The approach is ‘quite young.’ Blue field workflows include field wide, single well watcher,
LithoPro and more product placements than we could catch! Schlumberger has five Blue Field
implementations for Petrobras, with others in Canada, Australia and in Norway for Statoil.
Q&A
Who will make the money? Oils or service companies?
Service companies if we have anything to do with it! But there will be plenty of value for all – and
the outsourcing trend will continue.
TW0808_10 BP’s Azeri Field Optimizer – Sergi Sama, AspenTech
Operations are generally wary of model based optimization. We need new workflows to demonstrate its
usefulness. Available tools include modeling, simulation and metering. BP’s Azeri Field Optimizer’s (AFO)
architecture is built on a Hysys model and facility simulator coupled with well models (Invent/Prosper), a
well test database, the IP21 Historian and an Access AFO database. Design principles include the fact that
response time needs to be adequate. Users need to start before lunch and have the results by the time they
are back at their desks. AFO has made it easy to validate/tune and reconfigure as wells come on and off
stream. The asset-wide model/optimizer calculates wells, separation trains, compression and pumping trains
for onshore and offshore facilities. Data QC and checks are built in. The well model is tuned, and calibrated
against head curves and efficiency curves (equipment performance). Business workflows have been
‘canned’ e.g. first optimize oil volumes then commercial gas etc. AFO determined that production was
constrained by the ability of facilities to re-inject gas due to gas compressor performance (gas dew point
issues). By lowering the CWP dehydration pressure, production went up by 3% (15,000 bbl/day). The
optimizer also recommended reducing the amount of flash gas and to lower the slug catcher pressure to
maximize export volumes. Beyond optimization, the AFO supports condition monitoring, ‘what if’ and
engineering studies. BP is now embedding the AFO into the work process of the subsidiary. The power of
the AFO lies in being able to juggle many parameters across many fields and see the results immediately.
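As a toy illustration of the kind of single-parameter study the AFO automates at scale, the sketch below grid-searches a made-up production response to dehydration pressure. The model function and its numbers are invented for illustration and bear no relation to the real Hysys-based asset model:

```python
def production_rate(dehydration_pressure):
    """Toy stand-in for the asset-wide steady-state model: production rises as
    lower pressure frees up gas compression capacity, then falls off again as
    the gas dew point constraint bites. Entirely illustrative numbers."""
    return 500_000 - 2.0 * (dehydration_pressure - 18.0) ** 2

def optimize(lo, hi, step=0.5):
    """Grid search over the one free parameter, returning (best_rate, best_pressure).
    A real optimizer juggles many parameters across many fields simultaneously."""
    grid = [lo + i * step for i in range(int((hi - lo) / step) + 1)]
    return max((production_rate(p), p) for p in grid)

rate, pressure = optimize(10.0, 30.0)
```

The point of the AFO is precisely that such 'what if' runs finish within a coffee break rather than requiring a bespoke engineering study.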
Q&A
Invensys – are these steady state models?
Yes. But power users can do more in depth studies and check intermediate results to ensure that the
system is in steady state.
TW0808_11 Shell Brunei Champion field gas lift optimization – Ron Cramer, Shell Global
Solutions
Champion is a very complex operation. Early in the development of the field, in the 1970s, the Champion 2
platform was lost as it fell into the reservoir! Some reservoirs are quite near surface and Shell now has strain
gauges on the seabed to monitor subsidence. Gas lift is the primary means of production with injection into
260 wells on 29 platforms. For those who aren’t aware of gas lift operations, gas is injected down the casing
– where it lightens the oil on its way back up the production string. The technique is amenable to
optimization and is now considered a classic brown field operation. Gas lift optimization (GLO) the old way
involved awkward setting of downhole valves in the well. A wireline intervention was required to change
settings and this could only be done occasionally. But valves may sand-out, wear, and need changing-out.
Moreover the traditional way meant there was no active control over gas injected. Shell’s solution uses
two Shell Services’ FieldWare/Production Universe systems, one for monitoring and one for
optimization. In 2003, fluid control valves (FCV) and smart multi variable flow transmitters (MVFT) with
onboard diagnostics were installed as part of a gas lift improvement pilot. For Cramer, the key thing is ‘get
your instrumentation good and keep it good.’ Next comes the ironmongery in the form of 51 gas lift lines
deployed. Production Universe’s ‘virtual multi phase flowmeter’ gives estimated flow rates for oil gas water
for all wells all of the time. Shell could not afford multi phase flow meters in each well at around $500,000
each! Multi rate well tests were used to build a ‘data driven’ model – deemed ‘more sustainable’ than a
physical model. Well testing is also a known and sustainable activity. Well models were summed to give
field/total production. These real time estimates were then used to optimize gas lift. A configurable objective
function sits on top of Production Universe (PU) – which can be set to maximize production or minimize
opex etc. Interactions (backpressure effects) between wells can be simulated – e.g. a big well can shut out a
small well. Again, this is modeled from ‘deliberately disturbed’ well tests as above. It is an art to design
these right – something of a learning process for Shell. Shell then had 51 optimization assemblies in PU and
was able to improve well monitoring reducing deferment. PU is not a control system – it provides ‘advisory’
set points when something changes. Today the loop is not closed because of lack of confidence and the fact

that operators are there anyhow. Even though the system is not automated, it is ‘miles better than the old
system’ when it might take weeks before optimization after a compressor went down. MVFTs show gas lift
injection exceeds the historical figure by 20-30%. This makes for better injection gas accounting, better
GOR and understanding of the process. There is still much potential for improvement – especially in saving gas
(less gas, less compression). The IPM suite from Petroleum Experts was used for model calibration. Gas
injection is now controlled frequently from surface with far fewer wireline operations. Remote valves
mean fewer hazardous helicopter trips. The system has also eliminated gas injection to closed-in wells.
The business benefits include reduced gas lift fees (40% down on one platform) and a sustained 20%
production hike. Experience shows that generally too much gas is used. Currently, the demonstrated
potential for further production increase is limited by downhole valve orifice settings. These are
to be changed when an opportunity arises. The system has proved of great value in field-wide
optimization and will be rolled out to all Shell Brunei fields as opportunities arise.
Q&A
AspenTech – Do you perform short term reconciliation with physical metering?
There is a continuous reconciliation of fiscal metering with well measurements. In the North Sea,
Shell does hydrocarbon accounting with Production Universe.
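The advisory set-point idea – a configurable objective function sitting on top of data-driven well response models – can be sketched as a simple marginal allocation loop. The quadratic response curves, coefficients and total-gas constraint below are illustrative assumptions, not Shell’s actual Production Universe models:

```python
# Hedged sketch of data-driven gas lift allocation in the spirit of
# Production Universe. Well response curves and all coefficients are
# illustrative assumptions, not Shell's actual models.

def well_oil_rate(a, b, gas):
    """Concave 'data driven' response: oil rate as a function of lift gas."""
    return a * gas - b * gas * gas

def allocate_gas(wells, total_gas, step=1.0):
    """Greedy allocation: give each increment of lift gas to the well with
    the highest marginal oil gain. Output is an advisory set point per
    well, not a closed-loop control action."""
    alloc = {name: 0.0 for name in wells}
    remaining = total_gas
    while remaining >= step:
        def gain(name):
            a, b = wells[name]
            g = alloc[name]
            return well_oil_rate(a, b, g + step) - well_oil_rate(a, b, g)
        best = max(alloc, key=gain)
        if gain(best) <= 0:  # extra gas would reduce total production
            break
        alloc[best] += step
        remaining -= step
    return alloc

# Two hypothetical wells: with scarce lift gas, the 'big' well shuts out the small one
wells = {"big": (10.0, 0.05), "small": (4.0, 0.05)}
print(allocate_gas(wells, total_gas=50))
```

A real deployment would fit the response curves to the ‘deliberately disturbed’ multi-rate well tests described above and add backpressure interaction terms between wells.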
TW0808_12 IT Innovations Panel Session
0808_12.1 Rick Nicholson, Energy Insights
Energy Insights surveys oil and gas CIOs and executives globally. Today’s top IT initiatives/areas of
investment are 1) applications, 2) financial system (ERP) upgrades and 3) infrastructure (server upgrades, data
management, visualization). Information/analytics is the most active area for innovation. Looking forward
(out to 2010), companies report more of the same but add process improvement, content management and
workflow. It used to be that oil and gas led utilities in the adoption of new technologies. In the last couple of
years this has been reversed.
0808_12.2 Mike Sternskey, Microsoft
Sternskey gave a big plug for Microsoft’s service oriented architecture, i.e. BizTalk, Windows Workflow
Foundation and the SharePoint client. This has been successfully deployed by BP, which collates information
from the Historian, seismic and more into SharePoint.
0808_12.3 David Shimbo, Oracle
The digital oilfield is a reality. Oracle has been briefing geophysicists on using semantic web technologies to
identify prospects.
0808_12.4 Katya Casey, BHPB
All the technology we need is available but we are faced with problems digesting innovation.
Interoperability is not the problem it used to be – for instance ArcSDE works well with Oracle Spatial. But
there are too many ‘standards.’ Take SOA – everyone creates their own company standards. We need
coordination down to operating systems, middleware etc. We have huge amounts of data, but petroleum
engineering is only now thinking about databases rather than Excel. Many data types need better handling.
TW0808_13 GIS based HSE portal – Vineet Lasrado, Infosys
Lasrado described a systems integration job done for ExxonMobil on an incident reporting and analysis
portal. This uses a central database of safety incidents, integrating alarms from external agencies into the
company database. It allows lifecycle incident tracking from occurrence, through investigation,
recommendation and application of procedures, to check-in to the document repository before the incident
is finally closed out. KPIs track the whole process. The system leverages GIS, balanced scorecard, an HSE
dashboard, incident reporting and workflow automation. SharePoint also ran.
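The lifecycle described – occurrence, investigation, recommendation, procedure application, document check-in and close-out – amounts to a small state machine whose audit trail can feed the KPIs. The state names and transitions below are an illustrative reconstruction, not the actual ExxonMobil/Infosys design:

```python
# Hedged sketch of the incident lifecycle as a state machine. State
# names and transitions are illustrative, not the actual portal design.

TRANSITIONS = {
    "occurred": "investigate",
    "investigate": "recommend",
    "recommend": "apply_procedures",
    "apply_procedures": "check_in_documents",
    "check_in_documents": "closed",
}

class Incident:
    def __init__(self, incident_id):
        self.incident_id = incident_id
        self.state = "occurred"
        self.history = ["occurred"]  # audit trail feeding lifecycle KPIs

    def advance(self):
        """Move the incident to its next lifecycle state."""
        if self.state == "closed":
            raise ValueError("incident already closed out")
        self.state = TRANSITIONS[self.state]
        self.history.append(self.state)
        return self.state

inc = Incident("HSE-0042")  # hypothetical incident id
while inc.state != "closed":
    inc.advance()
print(inc.history)
```

KPIs such as time-to-close fall out of timestamping each entry in the audit trail.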
TW0808_14 Rock mechanics modeling – Peter Conolly, Chevron
Generic finite element analysis (FEA) and mechanical earth models (MEM) workflows involve geometry,
material properties, preprocessor, geological history (basin modeling - kinematic restoration with
geomechanics). The shared earth model is composed of 1) mechanical 2) geological, 3) basin and 4) seismic
earth models. The whole thing is the ‘whole earth model.’ This paper deals with the MEM component as
used to mitigate geological non-productive drilling time. Seismic inversion is used to derive mechanical
properties, with numerical stress modeling and mechanical failure modeling for well bore design and pore
pressure analysis. Tools include the earth model, preprocessor, FEA simulators (in-house and commercial –
although the latter have issues with data formats) and post processors. The problems lie in the gaps – getting
data/results from one tool to another, a ‘universal problem.’ On the hardware front, the previous debate over
symmetric vs. distributed multiprocessing is no longer an issue, although making software run on
distributed computing can be taxing. Modeling tools have evolved to where memory is not so much of an
issue. The FEA preprocessor provides a tetrahedral/hexahedral mesh and ‘glues’ nearly any data to any simulator.
An example from the Jura thrust structure showed 2D FEA: as the structure evolves in the simulator, the whole
mesh distorts. Tetrahedra are currently not well supported in models. Chevron is looking to move to ‘real’
computer aided design (CAD) NURBS8 surfaces and unstructured meshes.
Q&A
Folks don’t realize how important this is to integrated earth modeling. Simulators used to be based
on a fairly regular mesh. How do you update models when new data arrives?
The tools are there – although they are not used much in earth science.
Oil IT Journal – If you use ‘generic’ CAD over large areas like a basin, do you take account of
geodetics?
Yes, this can be an issue.
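The preprocessor’s ‘glue’ role – a neutral mesh representation with properties attached, consumable by any simulator – can be sketched minimally. The container below is a hypothetical illustration, not Chevron’s in-house format:

```python
# Hedged sketch of a neutral tetrahedral mesh container, illustrating the
# preprocessor's 'glue' role of moving data between earth modeling tools.
# Not Chevron's actual in-house format.
from dataclasses import dataclass, field

@dataclass
class TetMesh:
    nodes: list                 # [(x, y, z), ...]
    tets: list                  # [(i0, i1, i2, i3), ...] indices into nodes
    properties: dict = field(default_factory=dict)  # per-tet values 'glued' on

    def attach(self, name, values):
        """Glue a mechanical property (one value per tet) onto the mesh."""
        if len(values) != len(self.tets):
            raise ValueError("one property value required per tetrahedron")
        self.properties[name] = list(values)

# A single unit tetrahedron carrying a Young's modulus from (hypothetical)
# seismic inversion, ready to hand to any FEA simulator
mesh = TetMesh(
    nodes=[(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)],
    tets=[(0, 1, 2, 3)],
)
mesh.attach("youngs_modulus_gpa", [35.0])
print(mesh.properties["youngs_modulus_gpa"])
```

The point of such a neutral container is exactly the ‘universal problem’ above: each simulator reads one adapter instead of every other tool’s format.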
TW0808_15 Panel discussion

Panel – Don Paul (Chevron), Washington Salles (Petrobras), Don Moore (OXY) and Steve Fortune (BP)
Moore – The oil and gas industry doesn’t get as much credit as it should for getting to where it is today. A
decade or so ago I was at a CERA conference where all these young folks from the technology side were
telling us ‘you oil and gas guys just don’t get it!’ Then the tech bubble burst. Today you should be
encouraged by how fast the gap has closed. And this with far fewer people.
Fortune – We are also encouraged by the take up of technology. A couple of years ago we were in pilots
but now these are ‘full scale, value generating deployments.’
Paul – The data management issue is a good news/bad news story. The problem is now accepted – even
though it might be unsolvable – we now know how to live with it.
Royal Strategies – How do you leverage holistic sensing of an asset to adapt to emerging risk?
Fortune – Risk involves integrity and safety. We use predictive analytics – for instance in our GOM hubs it
gets hard to manage each piece of equipment to the same level. So we try to put the monitoring equipment at
optimum locations9.
Moore – You have to keep operations operating – it’s about making our numbers! And it’s going to get
harder and harder to do. We need to fix pump jacks before they go down. Saudi Aramco is still putting a lot
of effort into efficient operations. Everyone has to make their reserve numbers or it will be very painful!
Paul – We also need to constantly feed the financial community which requires lots of information. They
now can see a 2% products shortfall. For Chevron, every barrel not produced must be bought. At
$130/barrel, this means a direct financial consequence for every shortfall. Environmental liabilities are no
small deal. We are constantly observed by the government and NGOs. Hence the need to predict and
prevent.
OpenSpirit – Referring to Don Paul’s remark about unmanageable data – is this a Chevron specific
problem?

8 Non-uniform rational basis splines.
9 Fortune mentioned GE’s turbine monitoring and perhaps SolArc.
Paul – It’s not just about how hard it is to manage existing data – we need to adapt constantly to exponential
data volume growth.
Salles – It probably is not solvable – but we are working to reduce the problem.
Fortune – Information management is more important than digital oilfield technology. For instance our
advanced collaboration environment allowed us to spot a failing compressor rotor and fast track a repair. But
the absence of documentation and specifications cost us three days of downtime before a new rotor could be
machined. Real time is all very well, but you need information management as well. We now have a parallel
IM track in our field of the future program. It is a massive problem that requires a level of engineering input
that is not there. We have now re-introduced document controllers and data managers – roles that went out
in the low oil price days. This was a big piece missing from the organization that has been brought back in.
There is also a ‘process gap,’ keeping information up to date.
Moore – There are many pieces to the puzzle. We are now good at gathering data and storage is now cheap, but
we still don’t manage it very well. A lot of technology is built for specific functions – we need to look
beyond point solutions. Otherwise we will remain data rich and information poor.
Sternskey – Companies are driving to a standard architecture with initiatives like PRODML.
Salles – Petrobras’ approach is to blend in house solutions and service providers. We have developed an
integrated database (IDB) for geology and production data – this is being extended to include real time
digital oilfield data.
Moore – We have put a lot of effort into standardization. Not necessarily down to one application per task,
but not four! We have been successful across field operations and geoscience tools. You need to have the
same data/systems so people can move around and solve problems at a distance. Users see the value even if
it’s not perfect. Communities of practice are a great help.
Fortune – IT has been successful in standardization. But this can strangle businesses’ uptake of new
technology. So we prefer to look at core technology standards and when it makes sense, to look at business
processes. It is unusual for information to come from a single source – so we expose multiple data sources to
applications. Business problems are synonymous with integration problems. Our open architecture provides
plug and play and workflow components. Domain specific applications can connect together. The integrated
architecture is the key.
What about the ‘greying’ workforce – what should the CIO be investing in?
Fortune – This is a big issue. In the next decade, 40% of BP’s workforce will retire. So we need to let
retirees go on contributing. Technology will help – collaborative technology such as that used in Halo10
on Xbox Live. Knowledge capture and data integration also help – alerting young engineers to
problems and suggesting solutions.
Paul – What will you tell your supervisor about what you learned from the conference?
Salles – About model based prediction and service oriented architecture.
Fortune – Data center visualization and integration – and upping the game on optimization. The
next step, getting value from optimization, is hard to implement. This is where we are in BP, with a big
change management program centered on taking the digital oilfield to the next level.
Paul – I’ve been struck by the importance of clarity in business value. There is a cultural challenge in the
face of demand growth. We are way out of the linear range and entering new territory. The technology will
be there – change management is the key.

10 http://en.wikipedia.org/wiki/Halo_(Series).
TW0808_16 Exhibitors
0808_16.1 CISCO ‘NERV’ Network Emergency Response Vehicle

Cisco’s Network Emergency Response Vehicle (NERV).


Cisco’s Network Emergency Response Vehicle (NERV) is a mobile communications and command center
for disaster management. The system provides instant voice, video and data communications. Cisco’s IPICS
technology allows disparate radio systems to communicate with each other via IP translation: police on one
radio system can talk with fire professionals on another, who can talk with the National Guard on yet
another. TelePresence, video surveillance, Wi-Fi, satellite communications and IP telephony are all on
board. The NERV was used during last year’s San Diego fires to patch fire and sheriff’s radio systems. The
DSS satellite dish brought in television news, which was then encoded to Windows Media for the sheriff’s
PCs. More from
www.cisco.com/web/learning/le21/le34/fose/2008/post/docs/NERV_mktg.pdf.
0808_16.2 Software Innovation - Collaboration for OOs and EPCs (Chevron)

Software Innovation’s Coreworx document management and workflow administration11.


Software Innovation’s Coreworx provides engineering document management and collaboration for owner
operators and engineering prime contractors and suppliers. A customizable landing page displays role based
tasks, links to favourites, project information and news, etc. A tree view displays project related items and
an administrator’s document management system desktop (above) allows for system configuration. Chevron
has over 100,000 documents and up to 5,000 active workflows stored in a single Coreworx project covering
worldwide projects. Chevron’s use of Coreworx was the subject of a May 2008 FIATECH webinar
(http://www.fiatech.org/events/etarchive.html) on overcoming challenges to capital projects. More from
www.softinn.com.
0808_16.3 Credant Technologies – Mobile Guardian for securing sensitive data
Credant’s Mobile Guardian (CMG) provides a centrally managed, policy-based mobile data security and
management solution for data on laptops, desktops, tablet PCs, PDAs, smartphones, iPods, USB sticks and
other portable storage devices. CMG encrypts data on mobile devices and removable media throughout the
enterprise. The management console provides audit and reporting features for regulatory compliance. More
from www.credant.com.

11
Image courtesy Coreworx.
0808_16.4 EPSIS Real Time Assistant (ERA) in Chevron’s Master Schedule View

ERA Visual for maintenance planning on an on-shore oil field12.


EPSIS Real Time Assistant (ERA) is used in Chevron’s Master Schedule View (MS View). MS View plugs
into Chevron’s Minerva data infrastructure to pull together work order data from real time sources. The
ERA Visual component provides visualization of key information from the field including terrain model, sea
surface, sea bottom, geological horizons and well trajectories. 3D icons show objects of interest such as
buildings, constructions, equipment, vehicles etc. These are connected to data sources to track vehicles and
vessels and sensors/alarms. Users can drill down for more information by clicking an icon. A plug-in
architecture means that the framework can be integrated into existing IT infrastructures.

ERA Connect GUI for design and execution of collaborative workflows13.


SAIC and EPSIS have provided Chevron with collaboration software for meetings and a workflow
database. When a meeting starts up, everything is there. ERA Connect pulls up applications such as Excel,
PowerPoint, video and domain specific applications. The collaboration tool is used to add in a participant’s
PC and share a workspace. Workflows and workspaces can call any application, such as Google Earth or
Fekete.

12 Image courtesy EPSIS.
13 Image courtesy EPSIS.

ERA Connect supports cross application, cross location collaboration14.


The Microsoft development (written in C++) can also call Unix apps. Virtual teams can be created across
remote locations sharing all information on the screen and allowing for active participation. More from
www.epsis.no.
0808_16.5 Halliburton – AssetObserver, AssetConnect modeling and operating environment
The new version of Halliburton’s AssetObserver web-based operating environment is now based on the
IncuityEMI platform. AssetObserver allows production experts to access and integrate data from a range of
sources and monitor workflows and assets in real time. A companion AssetConnect product provides an
asset modeling environment for production engineering workflow optimization. More from
http://www.halliburton.com/ps/Default.aspx?navid=228&pageid=797.

AssetConnect in design mode15.


Landmark was also showing automated stimulation design workflows in the DecisionSpace for Production
platform. The automated workflow uses Halliburton’s SigmaSolver application for frac design. More from
http://www.halliburton.com/ps/default.aspx?pageid=1224.

14 Image courtesy EPSIS.
15 Image courtesy Halliburton.
0808_16.6 HP – Advanced Virtual Collaboration for Oil and Gas ‘open standard’
HP has been working on BP’s collaborative environment – the 3D Canvas for drilling, training and
simulation. This leverages collaborative technology from Qwaq (www.qwaq.com) – itself based on
academic open source development by the Croquet Consortium. HP’s Halo conferencing tool also featured.
HP plans to release the toolset as the ‘Advanced Virtual Collaboration’ (AVC) open standard for the oil
industry. The idea is to level the playing field for all stakeholders. Service companies will have avatars and
‘rooms’ whose doors act as security gates for confidential discussions and AVC data rooms. Application
functionality is preserved. More from www.hp.com.
0808_16.7 IMPAC Systems search and knowledge management placemat
Impac Systems’ ‘placemat’ diagrams the people, technology and processes involved in a knowledge
management deployment. Impac’s search system indexes and searches data using concepts defined as
classifications and taxonomies via a rule builder. BP is rolling the system out internationally.
More from http://www.impacsystems.com/contentdocs/ISE%20Enterprise%20Search%20Placemat.pdf.
0808_16.8 Infonic – Geo-Replicator – enterprise scale replication for Microsoft SharePoint

Infonic’s Geo-Replicator Microsoft Office SharePoint Server16.


Infonic’s Geo-Replicator plugs a gap in Microsoft Office SharePoint Server by providing remote and mobile
workers with access to enterprise content regardless of location and bandwidth. Geo-Replicator replicates
content to remote servers and synchronizes all file changes and updates. A server-to-laptop product
virtualizes SharePoint, files and enterprise content onto laptops for field workers, even when a network
connection is not available. Geo-Replicator facilitates the movement of large files across ‘challenged’
networks. Infonic’s patented byte-level differencing technology, Epsilon, sends content changes at the byte
level for ‘up to 99%’ compression and low latency. Geo-Replicator underpins Shell International Trading
and Shipping Company’s safety management system, synchronizing information in compliance with the
International Safety Management Code. This allows for distributed update of frequently updated documents
covering normal operation, maintenance and emergency situations. Shell’s safety documentation is
maintained at its London office and pushed to vessels with Geo-Replicator. More from
www.infonic.com/product_geo_replicator.php.

16 Image courtesy Infonic.
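Byte-level differencing of the kind Epsilon performs can be illustrated with a naive diff/patch pair that transmits only the changed byte span. This toy version (single contiguous change, no compression) shows only the principle; Infonic’s patented technology is far more sophisticated:

```python
# Toy illustration of byte-level differencing: transmit only the changed
# byte span between two document versions. Infonic's Epsilon is far more
# sophisticated; this merely shows the principle.

def byte_diff(old: bytes, new: bytes):
    """Locate the single contiguous differing span (toy assumption)."""
    start = 0
    while start < min(len(old), len(new)) and old[start] == new[start]:
        start += 1
    end_old, end_new = len(old), len(new)
    while end_old > start and end_new > start and old[end_old - 1] == new[end_new - 1]:
        end_old -= 1
        end_new -= 1
    # patch = (offset, byte count to delete, replacement bytes)
    return (start, end_old - start, new[start:end_new])

def byte_patch(old: bytes, patch):
    """Rebuild the new version from the old plus the patch."""
    offset, ndel, repl = patch
    return old[:offset] + repl + old[offset + ndel:]

old = b"Emergency procedure rev 3: muster at station A."
new = b"Emergency procedure rev 4: muster at station B."
patch = byte_diff(old, new)
assert byte_patch(old, patch) == new
print(len(patch[2]), "bytes sent instead of", len(new))
```

Over a ‘challenged’ ship-to-shore link, sending only the changed span rather than the whole safety document is where the bandwidth saving comes from.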
0808_16.9 IOCOM – (InSors) Grid collaboration room software (Chevron)

IOCOM’s collaboration room software in action17.


IOCOM (formerly InSors) supplies Chevron with collaboration room software. Real time visualization can
be used across low bandwidth connections to remote sites. The solution includes multipoint voice, video
and data tools supporting ‘complex distributed work processes.’ The IOCOM Grid runs Windows across
conference rooms, command centers, laptops and tablet PCs. A single server supports up to 20 simultaneous
connections in multiple meetings for collaboration via VoIP/SIP and traditional video conferencing. More
from www.iocom.com/io/index.html.
0808_16.10 Optelligent Solutions – oilfield data mining and visualization

Optelligent’s data mining/oilfield data visualization18.


Optelligent Solutions’ proprietary data mining tools include OSViz for oil field data visualization and
analysis of production, injection, reserves and decline trends. The software or service solution can be
coupled with IHS or proprietary data sets, which are presented spatially and with time series animation. RTA
(Probabilistic Regression Tree Analysis) helps discover relationships hidden in the data, displayed as
‘easy to understand’ regression trees. Fuzzy Screening helps identify analogs in databases, ranking them
according to user-defined fuzzy criteria which handle ‘imprecise, ambiguous, contradicting and uncertain
data.’ Packaged workflows are available for fluid flood optimization and automated opportunity screening

17 Image courtesy IOCOM.
18 Image courtesy Optelligent.
for identifying infill drilling locations and workover candidates. The tool was originally developed for
Canadian oil company Esprit Energy Trust (now part of Pengrowth) where it was used to optimize fluid
flood. More from www.optelligentsolutions.com.
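Fuzzy screening of this kind can be sketched with triangular membership functions: each candidate analog scores in [0, 1] per criterion, and candidates are ranked by combined score. The criteria, membership shapes and example fields below are illustrative assumptions, not Optelligent’s actual method:

```python
# Hedged sketch of fuzzy screening for analog ranking. Membership
# functions, criteria and weights are illustrative assumptions, not
# Optelligent's actual method.

def triangular(x, lo, peak, hi):
    """Triangular fuzzy membership: 1.0 at 'peak', 0.0 outside [lo, hi]."""
    if x <= lo or x >= hi:
        return 0.0
    if x <= peak:
        return (x - lo) / (peak - lo)
    return (hi - x) / (hi - peak)

def screen(candidates, criteria):
    """Rank candidate fields by average fuzzy membership across criteria."""
    scored = []
    for name, attrs in candidates.items():
        score = sum(triangular(attrs[k], *bounds)
                    for k, bounds in criteria.items()) / len(criteria)
        scored.append((round(score, 3), name))
    return sorted(scored, reverse=True)

# Hypothetical target: porosity near 0.22, permeability near 150 mD
criteria = {"porosity": (0.10, 0.22, 0.34), "perm_md": (10, 150, 400)}
candidates = {
    "field_A": {"porosity": 0.21, "perm_md": 140},
    "field_B": {"porosity": 0.15, "perm_md": 300},
}
print(screen(candidates, criteria))
```

Because membership degrades gradually rather than cutting off hard, ‘imprecise’ or borderline analogs still rank rather than being discarded.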
0808_16.11 P2 Energy Solutions – Enterprise Land
P2 Energy Solutions has re-designed its Tobin LandSuite, now called Enterprise Land. The Microsoft .Net
re-write is claimed to offer a services oriented architecture of ‘loosely coupled business processes that can
be accessed and consumed independently.’

P2ES Enterprise Land DataMart19.


A land ‘DataMart’ decision support tool uses technology from Informatica and Hyperion to provide
reporting and drill-down analysis tools. DataMart’s data structure simplifies the Enterprise Land
transactional database for faster data access. Application program interfaces allow business users to
import and export data efficiently without needing technical resources. P2ES’ APIs are interface tables that
exist throughout the application suite. The tables have built-in validation routines and editing capabilities.
APIs can be used for simple data imports and exports, or for the massive data conversions that follow
big acquisitions. More from www.p2es.com.
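The interface-table pattern – staged rows validated before they touch the transactional database – can be sketched as follows. The column names and validation rules are illustrative, not P2ES’ actual schema:

```python
# Hedged sketch of the interface-table pattern: rows are staged and
# validated before loading into the transactional store. Column names
# and validation rules are illustrative, not P2ES' actual API schema.

VALIDATORS = {
    "lease_id":    lambda v: isinstance(v, str) and len(v) > 0,
    "gross_acres": lambda v: isinstance(v, (int, float)) and v > 0,
    "state":       lambda v: v in {"TX", "OK", "NM", "LA"},  # hypothetical
}

def load_interface_table(rows):
    """Split staged rows into accepted rows and rejects with reasons."""
    accepted, rejected = [], []
    for row in rows:
        errors = [col for col, ok in VALIDATORS.items()
                  if col not in row or not ok(row[col])]
        (accepted if not errors else rejected).append((row, errors))
    return [r for r, _ in accepted], rejected

rows = [
    {"lease_id": "L-1001", "gross_acres": 640, "state": "TX"},
    {"lease_id": "", "gross_acres": -5, "state": "ZZ"},
]
good, bad = load_interface_table(rows)
print(len(good), "accepted;", len(bad), "rejected")
```

The same validated staging path serves both routine imports and the bulk conversions that follow an acquisition, since bad rows are quarantined with reasons rather than corrupting the transactional database.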
0808_16.12 Schlumberger – Credentus enterprise credential manager
Schlumberger’s ‘Credentus’ enterprise credential manager provides ‘multifactor’ authentication, managing
passwords and devices including digital certificates, smart cards, one time passwords and biometrics. The
supervisory system replaces traditional passwords and integrates with third party access management
solutions. 500,000 users in the top ten oil and gas companies are claimed. More from www.slb.com.
0808_16.13 QuickWells – Smart well design and procurement system
QuickWells is developing a package for front end engineering design and procurement activity associated
with high end ‘smart wells’ with advanced completions. All components are linked to a SQL Server
database which can be optionally hosted. A framework designer provides an accurate schematic of the
completion and lets the design engineer insert objects such as packers. Engineering calculation modules are
available to check well integrity during design. An API allows third party additions to the framework. More
from www.quickwells.com.

19 Image courtesy P2 Energy Solutions.

TW0808_17 The Data Room – Technology Watch subscription information


This report has been produced as part of The Data Room’s Technology Watch reporting service. For more
on this subscription-based service please visit the Technology Watch home page or email tw@oilit.com.

© June 2008
The Data Room
7 rue des Verrieres
F-92310 Sevres France
Tel (USA) 281 968 0752
Tel (UK) 020 7193 1489
Tel (France) +33 1 4623 9596
Fax +33 1 4623 0652
Technology Watch Home Page info@oilit.com