
What's Current, What's Coming? ZECO 2012 Provided Answers

October 18, 2012, VANCOUVER, B.C., Canada -

By James Straker

Vancouver put on its best face earlier this month with unseasonably sunny, warm weather, but that wasn’t the only thing Dr. Zak El-Ramly had to be proud of as he welcomed attendees to ZE’s annual conference and user forum. The signing of new world-class clients and further awards for a great product and outstanding customer support were among the achievements the founder and CEO of ZE PowerGroup Inc. (ZE) could look back on with pride after another successful year.

This year’s theme — Staying Ahead of the Curve: Trends and Challenges in Energy and Commodities Data — put forth several questions:

  • What are the changes affecting data in the market?
  • What are strategies for ensuring information has meaning and usefulness?
  • Why are curves important?
  • How important is systems integration?
  • Will new regulations enable or be a detriment to business?

Or, to sum it up and bring it closer to home: should companies be rethinking core business and operational functions to navigate the complexity of changes affecting markets and data? After all, it is about making the right decisions that will grow the business. Geopolitical shifts, economic uncertainties, and supply-and-demand volatility have always been determinants in decision making, but increasingly they combine into a compounding complexity that challenges even the most astute analysts. Doing the analysis — staying ahead of the curve — means that the routine operations of collecting and processing data need to be done with even more data, and with accelerated and optimized processes.

This year’s conference agenda was a line-up of knowledgeable speakers presenting topics that should provoke discussion in the boardrooms of every organisation that relies on data. And if those discussions are not already happening in some companies, they will be soon, because one point the conference made with certainty is that change is occurring now, and rapidly. Aiman El-Ramly, Chief Strategy Officer for ZE, and Bruce Colquhoun, Manager of Business Development for ZE, told an informative, and sometimes entertaining, back story about how the global market for energy and commodities data evolved and became so complex. Along with that history, they described how new technologies, financial instruments, regulation and market players are shifting the market at an unprecedented rate.
Technology has become a critical asset for organisations wanting not only to remain competitive but also to build business processes and IT systems that will, long into the future, answer the question, “Can we do the analysis?” ZE, as a leading data management software development company, is of course interested in how technology, and not only its own technology, will add value and can be used to facilitate decision making and increase profitability.
Software is the one component that is, if you will, tangible in any mix of factors influencing outcomes. You can get your hands on it and set it up to collect, organize and analyze data, and to display results however your organisation needs them. And it can do this easily, with consistent and predictable functions.
If only determining where the economy will end up were so easy. But really, how important is this technology in the decision-making process?

Every day, there are 2.5 quintillion (2.5 × 10^18) bytes of data created. (IBM)

Dave Elpers, Account Manager for IIR Energy and a conference speaker, said that organisations spend up to 65% of their data processing time collecting and organizing data and only 35% analysing it. And we can anticipate an even greater imbalance with the growth of big data. Big data has been with us for a few years now and is defined not strictly in terms of bytes but in terms of whether traditional means can collect, analyse, store and retrieve data within an acceptable period of time. Volume, velocity and variety (the 3V’s) are the generally accepted parameters for describing big data sets: big data is any combination of high volume, high velocity (the speed of capture and distribution) and high variety. Rather than a daunting challenge, companies should regard it positively for its potential to provide more insightful analysis. Elpers’s response to whether the so-called big data bubble will burst was, “No, it will keep getting bigger and bigger.” He went on to say that it is important for companies to determine what is relevant: “What to record; what to ignore. How do you sift through it and make sense of it?”

To date, making sense of it has meant manual processes and multi-system technology within organisations, which have in many cases led to data silos and disconnects between functional areas. Event speakers and panelists emphasised the importance of building infrastructures that will handle big data’s 3V’s. Companies need to consider whether their existing systems are scalable, enterprise-wide and well integrated — or whether they should be implementing new approaches. With high-volume data, distribution of information, especially for compliance with new regulations, will more than ever require a re-think of “how do you sift through it?” Systems that offer sophisticated metadata capacity will enable companies to organize and retrieve data and information easily, another important consideration for regulatory compliance.
New regulations by the U.S. Government to enforce transparency and accountability will effect a significant change to current business processes. The Dodd-Frank Wall Street Reform and Consumer Protection Act, commonly referred to as Dodd-Frank, became law in July 2010 and granted the Commodity Futures Trading Commission (CFTC) regulatory authority over swap transactions. James Yu, a partner with Houston-based Willis Group Consulting, described for attendees the key regulations of Dodd-Frank, important dates along its roll-out timeline, and what companies should be doing now to be ready for reporting requirements.

237 of 398 proposed Dodd-Frank regulations have been passed so far.

Wading through the hundreds of new rules enacted under Dodd-Frank is daunting enough, but the redefinition of business processes can be handled with the least cost impact and organisational shake-up if the right questions are asked now. Not the least of these is how an organisation’s existing technology will be affected. Yu pointed out that complex reporting that once took a week to complete will, under the Act, have to be completed in as little as 30 minutes.

Dodd-Frank will be a burden for companies unable to adapt to its close-to-real-time reporting requirements. Ajay Batra, a Director for Opportune and conference speaker, told the audience that process automation is the key, and that this applies not only to mandated reporting requirements but across entire enterprise operations, not the least of which is risk management. Richard Leonard, a ZE Business Account Manager and IT Analyst, underlined that belief by stating that enterprise data management is essential for enterprise risk management if organisations want stability and success in business.

Data silos can be problematic if a company hasn’t defined an overall information architecture that integrates all of its technology platforms.

“As complexity increases, managing systems becomes problematic,” said Batra. Collection, then processing and finally distribution are the major stages of a data process flow, and how this process is integrated and executed is what differentiates efficient from inefficient practices and operations. Batra suggests that companies must first define their processes. Then, efficiency begins upstream during data collection, where data quality, validation and description (assigning metadata, attributes and observations) are critical. This is followed by data processing that needs to be consolidated within a central data warehouse, thus avoiding the development of data silos. Silos typically come about when companies, over time, set up different applications to handle various projects. Applications then become part of a company’s operational framework, but problems arise if they are unable to share common types of information — often the result of not having had, in the first place, an overall information architecture that integrates all of a company’s technology platforms.
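The collect–process–distribute flow described above can be sketched in a few lines of code. The record shape, field names and quality checks below are illustrative assumptions for the sake of the example, not ZEMA’s actual schema or API; the point is simply that data is validated and described with metadata at collection time, then consolidated into one central store rather than per-application silos.

```python
from dataclasses import dataclass, field

@dataclass
class Observation:
    # A single collected data point, tagged at collection time with
    # metadata so it can be found and validated downstream.
    source: str                  # e.g. an exchange or vendor feed (illustrative)
    series: str                  # commodity/series identifier (illustrative)
    value: float
    attributes: dict = field(default_factory=dict)

def collect(raw_rows):
    """Collection step: validate and describe each incoming row."""
    observations = []
    for row in raw_rows:
        if row.get("value") is None:        # basic data-quality check
            continue
        observations.append(Observation(
            source=row["source"],
            series=row["series"],
            value=float(row["value"]),
            attributes={"unit": row.get("unit", "unknown")},
        ))
    return observations

def process(warehouse, observations):
    """Processing step: consolidate everything into one central
    warehouse keyed by series, avoiding per-application silos."""
    for obs in observations:
        warehouse.setdefault(obs.series, []).append(obs)
    return warehouse
```

With all functional areas writing to and reading from the same warehouse, the “how do you sift through it?” question becomes a query over shared metadata rather than a reconciliation exercise across disconnected systems.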

Defining a blueprint that ties everything together means incorporating an enterprise data management application (or suite of applications) within an enterprise risk management framework. A data management application provides the tools for collecting, processing and distributing data and data analysis, as well as integrating all business functions. Distribution is typically in the form of forward curves, which have become essential elements in the buying and selling of energy and commodities. A panel that included Alexey Melekhin, Director of Analytics for ZE; Vlad Alexeyev, a Senior Analyst at ZE; Jay Pruett, Head of Business Development for Argus Media; and Randy Wilson, Managing Director for Platts, talked about why the importance of curves cannot be overstated, why they are growing in scope, and how they are becoming more and more complex. “Forward curve vendors are challenged by ensuring that their products align with fair value accounting guidance in addition to meeting a growing demand for second order curves such as market implied correlations and volatilities,” said Wilson.
Forward curves have risen from relative obscurity in the last ten years to being one of the most important determinants for companies that carry out trading activities. Pruett observed that forward curves are essential for a market to develop and form the basis for volatility analysis. And increasing curve complexity warrants a move from spreadsheet-based analysis to applications that can automate curve-building procedures. Alexey Melekhin and Vlad Alexeyev described how curve organisation should start with creating calculation logic and determining curve dependencies. They went on to talk about how ZEMA, ZE’s data management suite, uses advanced filtering, interpolation, extrapolation and other shaping functions to build curves that best fit a series of data points, allowing greater precision and ultimately better decision making.
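The core of curve building — filling in the tenors for which no market quote exists — can be illustrated with a minimal sketch. This is a generic linear-interpolation example with flat extrapolation at the ends, written for illustration only; it does not represent ZEMA’s shaping functions, and the function name and the choice of linear interpolation and flat extrapolation are assumptions. Production curve builders typically layer in the filtering and more sophisticated shaping the panel described.

```python
from bisect import bisect_left

def build_forward_curve(observed, tenors):
    """Interpolate sparse observed prices onto a full set of tenors.

    observed: sorted list of (tenor_in_months, price) pairs from the market.
    tenors:   tenor points (in months) the finished curve should cover.
    """
    xs = [t for t, _ in observed]
    ys = [p for _, p in observed]
    curve = {}
    for t in tenors:
        if t <= xs[0]:
            curve[t] = ys[0]       # flat extrapolation before the first quote
        elif t >= xs[-1]:
            curve[t] = ys[-1]      # flat extrapolation beyond the last quote
        else:
            i = bisect_left(xs, t)                 # first quote at or past t
            x0, x1 = xs[i - 1], xs[i]
            y0, y1 = ys[i - 1], ys[i]
            curve[t] = y0 + (y1 - y0) * (t - x0) / (x1 - x0)
    return curve
```

For example, with quotes at the 1-, 3- and 6-month tenors, the 2-, 4- and 5-month points are filled in by interpolation, giving a complete curve that downstream valuation and risk systems can consume.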
The ZE Vancouver conference and user forum represents another point of pride for Dr. El-Ramly because of its ability to attract ZE partners, clients and potential clients. This event, and the many others that ZE sponsors throughout the year, is a realization of the philosophy that underlies ZE’s business approach: by connecting with others in the industry to exchange ideas, promote awareness of current trends, and envision new strategies, companies will see the advantage, if not the necessity, of using the data management technology that ZE has developed. And ZE knows that the best way to continue to develop its software successfully is by listening to the concerns, questions and needs of those who use it — its clients.
ZEMA has advanced data collection technology with rigorous security and metadata layers, and integrates and distributes information to an unlimited number of tools and systems. If you want to learn why companies everywhere are recommending ZEMA as the only choice for data collection, analysis and all-round systems integration, contact the company that is staying ahead of the curve and changing the way enterprise data management is done. Contact Us.


For media inquiries please contact:

Michelle Mollineaux,
Manager of Marketing & Business Development
Office: 604-244-1469, ext. 216
Direct: 778-296-4189
Fax: 604-244-1675
Email: michelle.mollineaux@ze.com