
Warning! Smart Big Data Analytics People in the Room

David Holmes

CTO & Chief Industry Executive, Global Oil & Gas Program at EMC
As Chief Industry Executive for the Global Oil & Gas Program, David is responsible for developing EMC’s Oil and Gas upstream solutions and product positioning strategy in conjunction with the Brazil Research Center and Global CTO Organization. He works with partners and clients to identify oil and gas business needs and designs solution architectures to support these opportunities. David has served on a number of industry committees including the European ECIM organization and SPE’s “Petabytes in Asset Management.” He has delivered numerous technical papers at conferences around the world and holds a patent for his work on the remote visualization of geotechnical applications.

Recently, I attended the Society of Petroleum Engineers Forum event on Big Data Analytics in Dubai, UAE. Forum events are industry-led and have no sponsorship; they bring together 50 thought leaders from vendors, oilfield service companies and oil companies to look at the challenges and opportunities related to a particular topic. It’s pretty exhausting being in a room full of smart people for four days, and my brain definitely needed the weekend to cool down.

But over four days of workshops and discussions, a clear theme was identified: the lack of an integrated approach to big data analytics. Companies complained of a lack of joined-up thinking and of business stakeholders investing in bespoke point solutions that only increased the complexity and challenges of delivering future solutions. It was pretty cool to be able to talk holistically about a range of solutions that addressed infrastructure, data integration, data quality, data analytics, data persistence, the role of the cloud and the third platform, as well as some top-notch PaaS and agile development smarts. EMC has all of these, available (as is our wont) either piece by piece or as a fully engineered solution wrapped up in the ribbon that is the EMC Federation Business Data Lake.

However, the implementation of an integrated big data analytics capability across the enterprise has consequences beyond those I had anticipated and at all levels of the business:


  1. Strategically – One attendee talked of his frustration at the lack of consistent adoption of big data analytics to support portfolio management. A comprehensive approach would allow companies to dynamically manage their portfolio of assets supporting the regular review of business strategy based on changing market conditions. Optimizing portfolio management has an ROI running into the hundreds of millions if not billions of dollars. One speaker talked passionately about how “bias is the mortal enemy of upstream performance.” Big data analytics should help remove bias and support rational decision making.
  2. Operationally – Many companies are introducing big data analytics tools to address particular workflows or challenges. While these solutions might address particular high-value problems, often they are not being implemented in a joined-up way. Almost all of the attendees supported having a centralized big data analytics function, with data engineers embedded in asset teams but operating as part of a central group on a common set of platforms.
  3. Tactically – There was quite a lot of talk about how big data analytics could be commoditized to support smaller opportunities. One example given was that you might save $40,000 a year by analyzing water purchasing contracts and linking this to your reservoir model. But that only makes sense if you can run a project to implement such a solution for less than $60K (assuming a 100% ROI over 3 years and an 18-month payback). The only way you can support small projects is to have all of the infrastructure and resources in place already.
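The tactical arithmetic above can be sketched directly. The $40K/year savings and $60K project ceiling come from the example in the text; the function names are illustrative, not from any EMC tool:

```python
# Back-of-the-envelope payback and ROI check for the water-contract example.

def payback_months(project_cost: float, annual_savings: float) -> float:
    """Months until cumulative savings cover the project cost."""
    return 12 * project_cost / annual_savings

def simple_roi_pct(project_cost: float, annual_savings: float, years: float) -> float:
    """Net gain over the period as a percentage of the project cost."""
    return 100 * (annual_savings * years - project_cost) / project_cost

annual_savings = 40_000  # saved by analyzing water purchasing contracts
project_cost = 60_000    # the most the project can cost and still make sense

print(payback_months(project_cost, annual_savings))     # 18.0 (months)
print(simple_roi_pct(project_cost, annual_savings, 3))  # 100.0 (% over 3 years)
```

A $60K project paying back in 18 months and doubling its money over three years is exactly the breakeven case the speaker described.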

Of course a lot of talk at the event was around the oil price. But did this put people off looking at technology projects? Not really, as one person put it – “Oil companies of all sizes are facing an existential crisis, the company that is first to effectively leverage big data analytics across their enterprise will have a material competitive advantage over its competitors. Then everyone else will have to follow suit.”

And it’s cheaper too!


Maximize G&G Application Performance & Lower TCO Simultaneously

 

As oil and gas companies wrestle with delivering dramatic reductions in their operating and capital budgets, whilst maintaining a razor-sharp focus on safe and efficient operations, many people are asking how, or even if, IT can support companies in these challenging times. But whatever answer we might come up with, the starting point in today’s climate must be to deliver material reductions in cost. Oh, and you had better remember that at the last count the industry has 250,000 fewer employees, so make sure you figure out how to transform user productivity.

So how can we drive up user productivity and reduce costs simultaneously? Recently we have been doing a lot of engineering work on next generation geoscience systems. Interpretation and modelling applications tend to be heavily workstation-oriented where individual users are equipped with expensive self-contained computing resources. When data is required, a low-latency high-bandwidth transfer from network storage to the workstation is needed to achieve useful levels of performance and productivity.

However as data velocity and volume increase, sustaining workstation-based applications is a real challenge. Individual workstations need ever increasing levels of computing power, memory and storage to cope, and delivering the required high-bandwidth low-latency I/O becomes eye-wateringly expensive.

[Figure: the workstation-oriented approach]

This rather expensive exercise places IT under constant pressure to deliver computing resources to meet the largest expected workload at any given time, which means that workstations are often either over-specified compared to average workloads or under-specified, leading to user frustration and inefficient working practices. The overall result is that computing power, storage and memory cannot be correctly balanced against workloads, since the high-end resources are not always needed and cannot be shared.


There is also a penalty in workload throughput. We have observed cases where geoscientists need to wait as long as 30 minutes to load projects into their applications. This has a negative impact on team productivity and agility, particularly when seismic data forms a critical part of the workflow.

The core strategy to address these challenges is the centralization of computational resources (both CPU and GPU) inside the data center using Converged Infrastructure. Essentially, this approach takes the enormous amount of computational power that is out on workstations and relocates it back into the data center. There are a couple of key reasons why this is beneficial:

  1. Operational Efficiency – the workstation-oriented approach leaves much of the computing resource underutilized and requires an expensive IT support mechanism. Having a shared set of central resources enables better provisioning of appropriate resources to users with thin-client devices – doing far more with less.
  2. Efficient Network Utilization – by co-locating computational resources with the data, we remove the need to shift large volumes of data over networks to individual workstations, giving wider, easier access to a rich data set – again, far more with less.

[Figure: the Petrotechnical Appliance VDI approach]

TCO studies have shown that EMC’s Petrotechnical Appliance – a Converged Infrastructure solution – can achieve cost savings in excess of 35% by migrating from distributed workstation computing to centralized computing delivered through VDI. Simultaneously, end-users are able to experience a more consistent delivery of computing resources to match individual workload demands without needing expensive workstation upgrades – so higher end user productivity and lower costs!

 

[Figure: TCO comparison]
The EMC Petrotechnical Appliance is based on the industry-leading Vblock® Converged Infrastructure from VCE. It is being increasingly adopted by oil & gas companies and is recommended by leading Oil Field Services companies as a key ingredient for optimizing geoscience operations, particularly in the current oil & gas economic climate.

Analyze this – big data for Oil & Gas 2016 economics

Julian Alfred

Sales Enablement, Global Oil & Gas Program at EMC
During three years developing joint solutions within EMC's Global Alliances group, Julian worked on a Seismic Data Management solution for Oil & Gas. Leveraging this experience, he is now responsible for Sales Enablement within EMC's Global Oil & Gas vertical to support revenue-generating field initiatives focused on Analytics, Big Data, Security and Business Resilience in Energy.

The concept of a Digital Oil Field is nothing new. Yet the Oil & Gas industry still struggles to leverage broad-scale big data & analytics in a way that makes it as mainstream and clearly understood as interpretation and modelling processes. Granted there has been some success in areas such as predictive maintenance and drilling optimization, but very little in sustained step-change improvements that have redefined the way production is planned and executed – especially when we look back at 2015 Oil & Gas economics and the continued challenge in 2016 and beyond.

One of the key obstacles that make leveraging the power of analytics difficult is the sheer volume and variety of data available to potentially fuel workflows:

  • How do I consolidate data from multiple systems into a single secure environment (a Data Lake) that scales easily and cost-effectively so it can be more easily managed?
  • How can I create models across heterogeneous data sets so that benefits can be seen not only in one section of the Hydrocarbon Value Chain, but across multiple domains to produce exponential benefits?
  • How can I execute analytical workloads at a velocity that is sympathetic to data latency issues so that business planning fundamentals have a larger reliable predictive component?

The good news is that advancements in data storage and computing power have offered solutions to individual technical infrastructure challenges, but putting them together to form a smooth running analytics platform still requires specialist skills.

However, perhaps even more important than the technology challenge is figuring out what we should perform meaningful analytics on.

  • Which areas of the business should I look to optimize that will make significant operational impact where savings or increases in value are measured in the tens or hundreds of millions of dollars?

The pessimism around big data and analytics in general comes through in the August 2015 Gartner report ‘The Demise of Big Data, Its Lessons and the State of Things to Come’, in which the following Strategic Planning Assumption is made:

“Through 2018, 90% of deployed data lakes will be useless as they are overwhelmed with information assets captured for uncertain use cases”.

 

Source: Gartner report – ‘The Demise of Big Data, Its Lessons and the State of Things to Come’ Published: 19 August 2015

So what is the answer?

We certainly cannot give up on big data & analytics any more than we should give up on making exploration and production more efficient. We need to claw back margins lost to low oil price economics, particularly in the Upstream segment of Oil & Gas.

I believe that at EMC we have found what appears to be a solid approach, one that enables the rapid, organized processing of data to extract timely, actionable insight and drive step-changing efficiencies. It is used to appraise, design and implement digital oilfield and big data & analytics projects to improve productivity and reduce well and facility downtime. We have identified five key ingredients for success:

[Figure: the five key ingredients for success]

But the bit that takes you over the wall is being able to blend the 5 ingredients effectively, so that one does not get in the way of another – for example, you could have a great analytics engine, but if the data management platform that will feed it scales poorly or cannot deliver quality-checked data rapidly, the entire analytics process is weakened. If you have access to external intelligence but cannot integrate the data in a way that enriches your existing data sets, then the true value of that industry knowledge is never realized.

We believe we’ve figured out a way to do that blending in a repeatable methodology driven through our Oil & Gas Big Data Vision Workshop. In a recent case spanning 10,000 wells, analytics using EMC solutions showed a potential increase in oil production of 8-14%, and decreases in completion costs of more than 5%. In another case, a large Independent was able to uncover that if they spent an extra $40M on unconventional drilling operations, they would see a $200M return over 3 years.

Want to learn more about the Big Data Vision Workshop? Just send an email to julian.alfred@emc.com.

HDFS Everywhere!

David Holmes

Challenges

Increasingly, oil and gas companies are looking to big data and analytics to provide a new approach to answering some of their hardest questions. One of the foundation components of this is the Hadoop Distributed File System (HDFS). HDFS is a unifying persistence layer for many of the big data and analytical tools on the market (Pivotal’s and other vendors’). Whilst many companies have looked to Hadoop clusters to provide both storage and compute, EMC has recognized that there are a number of challenges associated with this approach, including:

  1. If storage sits inside a Hadoop cluster, there must be a (potentially time consuming) ETL task to get data from where it sits into the cluster. As soon as the ETL process is complete, the data is out of sync.
  2. In order to increase storage it is also necessary to increase compute. This can create an imbalance between compute and storage capacity. This can further be exacerbated by the need to buy Hadoop distribution licenses for each node.
  3. Because Hadoop HDFS is designed to run on cheap commodity hardware, it ensures availability and durability by maintaining three (or more) copies of all data. This leads to much greater raw storage requirements than traditional storage environments (less than 33% usable capacity).
  4. All metadata requests to a Hadoop-HDFS cluster must be directed to a single NameNode. Although it is possible to configure a standby NameNode in Active/Passive mode, the failover process is weak and recovery is not straightforward.
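The storage overhead in challenge 3 can be made concrete with a quick calculation. The 3x replication factor is stock HDFS behavior; the 20% parity overhead used for comparison is an illustrative assumption, not a quoted Isilon figure:

```python
# Usable capacity under triple replication (stock HDFS) versus a
# single-copy-plus-parity scheme. The 20% parity overhead is an
# illustrative assumption for comparison only.

def usable_with_replication(raw_tb: float, replicas: int = 3) -> float:
    """Every block is stored `replicas` times, so usable = raw / replicas."""
    return raw_tb / replicas

def usable_with_parity(raw_tb: float, parity_overhead: float = 0.2) -> float:
    """One copy of the data plus parity blocks costing `parity_overhead` extra."""
    return raw_tb / (1 + parity_overhead)

raw_tb = 900
print(usable_with_replication(raw_tb))  # 300.0 TB usable -- about 33% of raw
print(usable_with_parity(raw_tb))       # 750.0 TB usable -- about 83% of raw
```

The same 900 TB of raw disk yields well over twice the usable capacity once the cluster no longer has to keep three full copies of every block.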

Solutions

To address these challenges, EMC has developed three storage solutions that resolve these issues (with a fourth coming soon):

  • EMC Isilon provides high-performance HDFS storage as an additional protocol. This means that any data copied to the Isilon cluster using CIFS or NFS can be made available through HDFS. The storage is much more efficient, as data protection is achieved using Isilon’s built-in protection, so only one copy of each data file (plus parity) is created. In addition, each Isilon node runs as both a NameNode and a DataNode, so there is much higher performance and availability, with no single point of failure.
  • EMC Elastic Cloud Storage (ECS) provides a very scalable geo-distributed object store which fully supports HDFS. ECS is available either as an appliance (with low cost EMC commodity hardware) or as software (in a ‘bring your own tin’ model). ECS is highly compelling for companies looking to build vast geo-distributed object data stores and also for archiving workflows (especially for seismic acquisition data).
  • EMC ViPR Data Services (VDS) enables commodity and other vendor storage systems to be exposed using the HDFS protocol. So for storage systems that do not natively support HDFS, you can use VDS to layer on top of this storage and make the data available via HDFS.
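As a sketch of what multi-protocol access looks like in practice, the same file could be reached over NFS and over HDFS with no copy in between. The mount point, hostname, and port below are hypothetical placeholders, not values from any EMC deployment:

```shell
# Hypothetical example: one file, two protocols, zero ETL.
# /mnt/geoscience and isilon.example.com:8020 are placeholder names.

# 1. An existing application writes the file over NFS, as it always has:
ls -lh /mnt/geoscience/wells/production.csv

# 2. A Hadoop job reads the same bytes over the HDFS protocol:
hdfs dfs -ls hdfs://isilon.example.com:8020/geoscience/wells/
hdfs dfs -cat hdfs://isilon.example.com:8020/geoscience/wells/production.csv
```

Because both paths resolve to the same stored blocks, the analytics side always sees current data; there is no extract step to fall out of sync.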

Benefits

Using these technologies, EMC makes it very easy to deliver on an ‘HDFS Everywhere’ strategy, but what are the compelling reasons for doing this?

  1. By making the entire multi-vendor storage real estate available through HDFS, big data and analytical tools can be layered on top of the enterprise persistence layer allowing in-place analytics without having to perform any ETL tasks. This capability delivers cost reduction, reduced cycle times and increased productivity.
  2. As companies seek to deploy the new generation of cloud-native applications, it is essential (particularly in oil and gas) to have an integrated environment for old and new applications sitting on top of common persistence layers. This is an essential characteristic of contemporary IT systems as companies look to embrace bimodal IT strategies.

Summary

At EMC we are increasingly hearing from oil and gas companies that to achieve their efficiency targets and cost reductions, they need a concise roadmap to enable them to consolidate their legacy applications with an environment that supports and embraces the next generation of mobile, big data analytical apps. HDFS Everywhere is one element of the strategy to achieve this.

For many oil companies, the ability to run big data analytics against all their structured, semi-structured and unstructured data is compelling. Removing the need for complex ETL tasks, and the analytical latency they introduce, enables new analytics use cases and gives legacy vendors an easy roadmap to start migrating their applications to the 3rd Platform.

PS If you’d like to know more, swing by our booth #2511 at SEG in New Orleans (18-21 October 2015).

 

Are You Ready for Cloud Control?

Gonzalo Merchan

Gonzalo Merchan, IIG Director for the Energy sector, is based in California, United States. Gonzalo brings over 25 years' experience in the enterprise software solutions arena, including 10 years with EMC Documentum, 5 years with BEA, and over 15 years in senior-level positions at IBM and Fluor. As part of the IIG Solutions Strategy Group, Gonzalo is responsible for driving the EMC IIG solution strategy in the energy vertical, focusing primarily on bringing industry solutions to market. Gonzalo also covers the EMC Energy Partner ecosystem and the complementary solutions they bring to market, use cases with major energy companies globally, and key initiatives such as Advisory Councils. Gonzalo has two sons and two grandkids living in the Southern California area. Gonzalo enjoys time with family and travel to locations with sun, sand, and especially good food.

The worldwide demand for energy, increased competition for capital, and an aging infrastructure are driving companies in the Energy, Utilities and Natural Resources sectors to search for ways to better manage complex construction projects, boost production, and increase the safety and profitability of refineries, mines, power plants and utility networks.

But managing large volumes of engineering-related content throughout the business life cycle – from exploration and production to facility operation and decommissioning – can create unintentional delays in project completion, and increases costs and risks. An often overlooked aspect of improving the management of engineering projects, plants and facilities, is better content management. Companies that truly want to impact the bottom line need to consider how better content management using industry-specific tools and templates can reduce risk, improve productivity, eliminate costs, and facilitate regulatory compliance. (more…)

Technology and Business Alignment in E&P

Tim Voyt

Mr. Timothy Voyt has more than 20 years of experience providing technology and services to the Energy industry, both domestically and internationally. Mr. Voyt joined EMC in April 2005 as the Oil and Gas Director and has global responsibility for all aspects of EMC's Oil and Gas Vertical Program. Prior to joining EMC, Mr. Voyt served as Executive Vice President of Operations for Tobin International, where he had operational accountability for all of Tobin's Data Products, Software Products, Services, and International Operating Divisions. Mr. Voyt also spent more than 9 years with Landmark Graphics building and managing Landmark's pre- and post-sales services businesses within North America, Europe, Africa, and Russia.


One of the best parts of my job is that I get to work with E&P customers and partners who are working to solve really interesting business challenges. What’s even more interesting to me is that more and more of these business challenges have an information technology component that will positively or negatively impact the overall efficiency and effectiveness of the ultimate solution. The intersection and (more often than not) inter-dependence of technology within business operations provides fertile ground for technology companies such as EMC to engage with our clients in line-of-business discussions rather than just traditional IT dialogues. As such, our customers and partners are looking to EMC to take an active role in innovative research and development projects that incorporate business workflows, process improvement efforts, application development, underlying infrastructure stacks, and data themes. It is good to see IT professionals having a “seat at the table” and being able to contribute real value when it comes to solving business challenges. The realization that IT can help create a business advantage rather than (more…)

What Will Utilities Do With Massive Volumes of New Data?

Daniel Pearl

Dan Pearl is a technical subject matter expert within EMC's utility industry business unit. He is responsible for analyzing Utility Companies' business challenges and translating them into proven technology solutions that increase performance, reduce risk and accelerate time to market. Dan has worked extensively with leading Smart Grid and AMI vendors to build relevant EMC solutions around meter data management, cyber/physical security, and grid intelligence. He also has prior experience within EMC building solutions in the Oil & Gas sector.

“The Future belongs to the companies who figure out how to collect and use data successfully.”
~ An O’Reilly Radar Report: What is Data Science?

The analytics market is enormous; it is estimated at around $70 billion today, growing at an astounding 14-20% per year.[1] And everyone’s getting in, from the technology heavyweights to the warehousing specialists to the visualization tool vendors to industry-specific companies across all sectors.

The utility industry is no different. Energy Central’s Utility Analytics Institute forecasts that utilities worldwide will spend over $2 billion on analytics,[2] and Pike Research forecasts over $4 billion annually by 2015, reflecting a 65% compound annual growth rate (CAGR) from 2010.[3]
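As a sanity check on the Pike Research numbers, a 65% CAGR reaching roughly $4 billion in 2015 implies a particular 2010 base. The back-calculated figure below is my own arithmetic, not a number quoted in either report:

```python
# Back-calculate the 2010 market size implied by "over $4B annually by 2015
# at a 65% CAGR from 2010". Derived arithmetic only.

def implied_base(final_value: float, cagr: float, years: int) -> float:
    """Starting value that grows to final_value after `years` at rate `cagr`."""
    return final_value / (1 + cagr) ** years

base_2010 = implied_base(4e9, 0.65, 5)
print(f"${base_2010 / 1e6:.0f}M")  # roughly $327M in 2010
```

In other words, the forecast has utility analytics spending growing more than tenfold in five years, which is what makes the CAGR claim so striking.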

Why? (more…)

Private Cloud for Upstream Oil and Gas – Realizing the PetroCloud

Larry Kaufman

Larry Kaufman, Chief Industry Architect for EMC’s Global Energy Program, is responsible for developing and delivering complex, integrated solutions based on EMC and partner technologies that solve challenging business problems and workflows within the Oil and Gas Industry. Larry began his career in Oil and Gas as an open-hole wireline engineer. Larry also spent 16 years with Landmark Graphics and Halliburton in various sales, technology development and management roles as Landmark grew from 60 employees to over 2,000 before being acquired by Halliburton. Larry has a Bachelor of Science in Electrical Engineering and over 24 years of experience developing and deploying information technology solutions serving the upstream oil and gas industry. Outside of work Larry enjoys woodworking and spending time at the lake with family and friends.

It’s hard to believe that just 25 years ago seismic interpretation was a manual process. Rather than workstations, provisioning a geoscientist meant providing a solid drafting table, good light, plenty of colored pencils, and paper weights. The process of developing a prospect could take months or even years. The advent of computer-based systems was a boon to the industry and has drastically reduced workflow times and the risk of drilling dry holes. The infrastructure evolved from a large, expensive turnkey system shared by many geoscientists, to large, isolated, individual workstations. For data backup, disk was expensive and tape ubiquitous.

The client/server model coupled with shared storage decreased costs, reduced the use of tape, and streamlined access to data.  Thin client technology, which puts minimal hardware on the desktop and relies on the computing power of the server, was yet another innovation used to reduce costs.  One could argue that today we have come back to thick clients with “fat” (large memory, multiple CPUs) Windows and Linux based workstation applications.

A new IT revolution is upon us – the shift towards cloud computing. The private cloud, deployed on (more…)

Energy & Engineering Day at Momentum Europe 2011

Gonzalo Merchan

On November 1st, EMC will be hosting an Energy & Engineering Day at Momentum Europe 2011 in Berlin.

For Momentum Energy & Engineering attendees, EMC has scheduled a full-day agenda highlighting how EMC and Partners are transforming information management to optimize critical business processes, improve engineering and construction projects, and ensure Environmental, Health and Safety (EHS) compliance.

The agenda includes:

  • An Energy & Engineering Annual Roundtable, facilitated by Roberta Bigliani, Research Director IDC Energy Insights for EMEA. This roundtable is intended to provide attendees with an opportunity to share best practices, review current market trends, become part of EMC’s Energy & Engineering community and influence the IIG Solution development roadmap.
  • Customer presentations, including how Saudi Aramco is addressing structured and unstructured information management through the implementation of Digital Oil Field solutions and best practices for Exploration & Production, and how Saipem S.p.A. is addressing common business challenges to provide a simple, flexible, yet standardized document archiving and management model for all enterprise business units
  • Demonstrations and presentations from key EMC IIG Energy & Engineering Partners that are developing complementary solutions running on the EMC IIG platform

There will also be (more…)

Spills, Meltdowns and Environmental Remediation: The ROI of Being Prepared when Litigation Strikes

Heidi Maher, Esq.

Principal, eDiscovery and Compliance Legal Team
Heidi Maher is an eDiscovery advisor in EMC’s Compliance & eDiscovery Practice, where she leverages her legal experience along with EMC’s unique technology to help organizations address challenges related to eDiscovery, compliance, and records management. By serving as a liaison between legal, IT, and other key business units, Ms. Maher helps organizations implement internal procedures and technology solutions that minimize the risk and expense associated with compliance and eDiscovery. She has drafted discovery readiness plans for multinationals and advised corporations and law firms on privacy laws and best practices for international data transfers. Ms. Maher has conducted numerous CLE courses, webinars, workshops, and industry conference presentations, as well as authored articles on eDiscovery and compliance in publications such as Digital Discovery & Electronic Evidence. She has been a member of Working Groups 1 and 6 of The Sedona Conference, a well-known eDiscovery think tank, and was a project leader for the Electronic Discovery Reference Model (EDRM). Ms. Maher is also a frequent contributor to EMC’s eDiscovery blog at www.kazeon.com/blog. Prior to EMC, Ms. Maher gained extensive litigation and technology experience as a legal consultant with RenewData Corp., where she helped educate lawyers and litigation professionals in corporations and law firms about technology and its role in litigation. Prior to that, she was a felony prosecutor, Assistant State Attorney General, and an attorney in private practice working on complex multi-million-dollar class-action and mass-tort litigation at the largest law firm in Austin, Texas. Ms. Maher received her J.D. from Baylor School of Law and her B.S. from the University of Texas at Austin. She is licensed to practice law in the Eastern and Western District Courts of Texas as well as the Fifth Circuit Court of Appeals.

IQPC’s well-organized and well-attended eDiscovery for Oil and Gas Seminar is September 26-27 in Houston. Thought leaders from, among others, Hess, Anadarko, BP, TransCanada and Valero will be on hand to speak to the unique and not-so-unique eDiscovery challenges facing the Oil and Gas sector. Recent record profits and catastrophic events have put the industry in the spotlight, not just for lawsuits but also for government investigations and regulatory oversight.

Organizations already have difficulties responding in a timely and cost-efficient manner to eDiscovery requests. The industry, being inherently global in nature, has the added challenge of determining how best to bring information back from certain countries. Many foreign countries, especially those in the European Union, have blocking statutes and other privacy laws that prohibit the transfer of data to the United States. The topic I am speaking about is (more…)