It's now easy to access multiple live data sets from NEOpoint using your favourite tools such as Excel, R, MATLAB or Python. In Excel, for example, you can run and load data from multiple reports with a single click of a button. It's easy to do and far quicker than manually copying and pasting data, especially for repetitive analyses. NEOpoint holds full historic data for the NEM, gas, BOM and more, and the data services feature lets you access it all without installing any drivers - it simply uses the standard web services features built in to most analysis tools.
Both NEOpoint and NEO provide data services so you can load data directly into your favourite third party tool such as Excel, R, MATLAB or Python. All of these tools offer a single-command "web query" that takes a URL. Each tool expects a different data format, either XML, JSON or CSV, so NEOpoint and NEO support all three formats.
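For example, switching formats is just a matter of changing the format segment of the service URL. A minimal sketch in Python (the URL shape follows the NEOpoint link examples below; the `service_url` helper is our own illustration and the key is a placeholder):

```python
from urllib.parse import urlencode

# Base service endpoint; the path segment selects the response format.
BASE = "https://www.neopoint.com.au/Service"

def service_url(fmt, favourite, **params):
    """Build a data-service URL for the given format ('Xml', 'Json' or 'Csv')."""
    query = urlencode({"f": favourite, **params})
    return f"{BASE}/{fmt}?{query}"

# Same favourite, CSV format; urlencode percent-encodes the path for us.
url = service_url("Csv", "101 Prices\\Region Prices 5min",
                  period="Daily", instances="NSW1", key="*****")
print(url)
```

Changing the first argument to "Xml" or "Json" yields the matching URL for tools that expect those formats.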
The NEOpoint data service can be run from any machine connected to either the IES NEOpoint server or a customer's own NEOpoint server. The URL for a NEOpoint data service link uses the Favourite path and can be found by clicking the "Get Links" button.
The NEOexpress data service connects to the IES NEOexpress server and is only accessible on the PC that is running the NEOexpress client. It uses Report paths, whereas NEOpoint uses Favourite paths. Simply right click on any chart and select "Copy Service URL" to get the data service link.
The data service in QueryTool allows you to run a query directly (compared to NEOpoint and NEOexpress, which run favourites and reports). For examples of how to run the data service using QueryTool, please see the sections on running SQL queries with each of the different tools below.
Example URL links for NEOpoint and NEOexpress are shown below:
NEOpoint:
https://www.neopoint.com.au/Service/xml?f=101+Prices%5cRegion+Prices+5min&period=Daily&from=2015-12-21&instances=NSW1&section=0&key=KEYXXX
NEOexpress:
http://neoweb3.iesys.com.au/xml?report=Australia%5cMMS+R6%5cDispatch%5c1+Price%2c+Demand%2c+Flows%5c1+EnergyPrice%5cRegion+Price+5min&instances=NSW1&pc=0d1d
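The report and favourite paths in these links are percent-encoded, which makes them hard to read at a glance. A quick way to decode one, sketched here with Python's standard library:

```python
from urllib.parse import unquote_plus

# The percent-encoded report path from the NEOexpress link above
encoded = ("Australia%5cMMS+R6%5cDispatch%5c1+Price%2c+Demand%2c+Flows"
           "%5c1+EnergyPrice%5cRegion+Price+5min")

# unquote_plus turns '+' into spaces and decodes %5c (backslash), %2c (comma)
print(unquote_plus(encoded))
# → Australia\MMS R6\Dispatch\1 Price, Demand, Flows\1 EnergyPrice\Region Price 5min
```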
As with all the other tools, data services allow you to load multiple data sets automatically. We have an XLSM file that demonstrates a simple case of loading 5 min price and demand from two separate reports. The macro is easy to modify to load any report, time period or instances: Excel sample
Excel expects web query data to be in XML format. You can either load it manually or, if this is something you do on a regular basis, create a macro. The advantage of macros is that you can instantly load any number of live data sets and perform calculations on them all at the press of a button.
To start with, open NEOpoint, run the required report, then click the "Links" button and select and copy the XML link. Sample XML link: https://www.neopoint.com.au/Service/Xml?f=104+Bids%5cRegion+Merit+Order+Stack+5min&period=Daily&from=2015-10-13&instances=Generator;NSW1&section=0&key=***** The XML link has several parameters that you can customize.
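For instance, rerunning the same favourite for a different date only requires changing the `from` parameter. One way to do this programmatically is sketched below (the `with_params` helper is our own illustration, not part of any NEOpoint library, and the key is a placeholder):

```python
from urllib.parse import urlsplit, parse_qs, urlencode, urlunsplit

def with_params(url, **changes):
    """Return a copy of a data-service link with some query parameters replaced."""
    parts = urlsplit(url)
    # keep_blank_values preserves empty parameters such as 'instances='
    params = {k: v[0] for k, v in parse_qs(parts.query, keep_blank_values=True).items()}
    params.update(changes)
    return urlunsplit(parts._replace(query=urlencode(params)))

link = ("https://www.neopoint.com.au/Service/Xml?f=101+Prices%5CDispatch+Prices+5min"
        "&period=Daily&from=2015-10-13&instances=&section=0&key=*****")

# 'from' is a Python keyword, so it is passed via a dict
new_link = with_params(link, **{"from": "2016-01-01"})
print(new_link)
```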
To manually load a result, follow these steps:
The big advantage of data services is loading results automatically. This makes it easy to create spreadsheets that, with the click of a button, load several data sets and perform calculations. The macro below shows how to load data automatically. Simply copy the macro text and paste it into your own copy of an Excel spreadsheet, modifying the URL as required. You can also set where you want the data to be loaded, and load additional data sets in the same macro by repeating the XmlImport line with the required URL for each additional data set. To understand the macro we also need to explain two other lines of code:
The sample macro code is shown below.
Sub Macro1()
'
' Get prices
'
    For Each XmlMap In ActiveWorkbook.XmlMaps
        XmlMap.Delete
    Next
    Columns("G:H").ClearContents
    ActiveWorkbook.XmlImport URL:= _
        "https://www.neopoint.com.au/Service/Xml?f=101+Prices\Dispatch+Prices+5min&period=Daily&from=2015-10-13&instances=&section=0&key=*****" _
        , ImportMap:=Nothing, Overwrite:=True, Destination:=Range("$G$1")
End Sub
R is a very popular open source language based on the S language and used for statistical data analysis. The simplest way to load data is via the CSV format.
Sample code below loads data from a 5min price report and plots it. Note that this method runs a NEOpoint report rather than an actual query.
w = read.csv(file = "https://www.neopoint.com.au/Service/Csv?f=101+Prices%5CDispatch+and+Predispatch+Prices+5min&from=2018-11-05&period=Daily&instances=&section=-1&key=*****")
plot(w$DateTime, w$NSW1.Price)
lines(w$DateTime, w$NSW1.Price)
rdate <- strptime(w$DateTime, "%Y-%m-%d %H:%M:%S")
plot(rdate, w$NSW1.Price)
lines(rdate, w$NSW1.Price)
To run SQL queries you need a QueryTool enabled account. Because queries tend to be quite large, the HTTP GET method used above cannot be used due to the ~2K limit on the length of a URL; the POST method must be used instead. The sample code below shows how to execute a query. It uses the httr library.
library(httr)
query <- "select `SETTLEMENTDATE`, `RUNNO`, `REGIONID`, `PERIODID`, `RRP`, `EEP`, `INVALIDFLAG`, `LASTCHANGED`, `ROP`, `RAISE6SECRRP`, `RAISE6SECROP`, `RAISE60SECRRP`, `RAISE60SECROP`, `RAISE5MINRRP`, `RAISE5MINROP`, `RAISEREGRRP`, `RAISEREGROP`, `LOWER6SECRRP`, `LOWER6SECROP`, `LOWER60SECRRP`, `LOWER60SECROP`, `LOWER5MINRRP`, `LOWER5MINROP`, `LOWERREGRRP`, `LOWERREGROP`, `PRICE_STATUS` from mms.tradingprice where SETTLEMENTDATE between '2018-11-1' and '2018-11-2' limit 0, 10000"
requ <- list(query = query, key = "*****", format = "csv")
res <- POST("https://www.neopoint.com.au/data/query", body = requ, encode = "form", verbose())
tabtext <- content(res, "text")
tab <- read.csv(text = tabtext, header = TRUE)
The above might look a bit tedious but it is easy to create a function that will wrap all of that up so that running a query just becomes a single function call. In the code sample below we define a function to run a query then we call it. The function takes a query string and returns a data frame.
RunQuery <- function(query) {
  requ <- list(query = query, key = "*****", format = "csv")
  res <- POST("https://www.neopoint.com.au/data/query", body = requ, encode = "form", verbose())
  tab <- content(res, "text")
  read.csv(text = tab, header = TRUE)
}
RunQuery("select * from mms.tradingprice where SETTLEMENTDATE between '2018-11-2' and '2018-11-3'")
Python was probably the best of the tools we reviewed! It is a high level scripting language with a very readable syntax, making code easy to understand and share. It is supported by a very active community with a wealth of libraries and IDEs, and best of all it's free and open source.
The Python source code for loading some data from a NEOpoint report and plotting it is shown below. We use the pandas and matplotlib libraries in this example. pandas is a data management and analysis library and matplotlib is a plotting library; along with numpy, these are some of the most common libraries and come pre-installed in most Python distributions.
# Imports
import pandas
import matplotlib.pyplot as plt

# CSV link
url = 'http://neopoint.com.au/service/csv?key=*****&section=0&f=101+Prices%5cDispatch+Prices+5min&period=Daily&from=2015-04-09&instances='

# Read data and plot
data = pandas.read_csv(url)
data.plot()
plt.show()
The chart produced by the above script is shown below.
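Note that pandas reads the DateTime column as plain text by default, so the x-axis shows row numbers rather than timestamps. To get a proper time axis, parse the column explicitly. A minimal sketch using an inline stand-in for the service CSV (the column names and values here are illustrative only):

```python
import io
import pandas

# Inline stand-in for the CSV body returned by the data service
csv_text = """DateTime,NSW1 Price
2015-04-09 00:05:00,35.2
2015-04-09 00:10:00,36.1
2015-04-09 00:15:00,34.8
"""

# parse_dates converts the DateTime column to real timestamps;
# index_col makes it the plot's x-axis
data = pandas.read_csv(io.StringIO(csv_text),
                       parse_dates=["DateTime"], index_col="DateTime")
# data.plot() would now place the timestamps on the x-axis
```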
The Python snippet for loading some data from an SQL query is shown below. Note the code uses the HTTP POST method because the query length will usually exceed the allowed URL length for a GET.
# Imports
import csv
import io
import requests
import pandas

# Query to run
query = '''select `SETTLEMENTDATE`, `RUNNO`, `REGIONID`, `PERIODID`, `RRP`, `EEP`, `INVALIDFLAG`, `LASTCHANGED`, `ROP`, `RAISE6SECRRP`, `RAISE6SECROP`, `RAISE60SECRRP`, `RAISE60SECROP`, `RAISE5MINRRP`, `RAISE5MINROP`, `RAISEREGRRP`, `RAISEREGROP`, `LOWER6SECRRP`, `LOWER6SECROP`, `LOWER60SECRRP`, `LOWER60SECROP`, `LOWER5MINRRP`, `LOWER5MINROP`, `LOWERREGRRP`, `LOWERREGROP`, `PRICE_STATUS` from mms.tradingprice where SETTLEMENTDATE between '2018-11-01' and '2018-11-02' limit 0, 10000'''

# Payload for the POST request
url = 'https://www.neopoint.com.au/data/query'
params = {'query': query, 'key': '*****', 'format': 'csv'}

# Send HTTP request (the payload goes in the request body, not the URL)
response = requests.post(url, data=params)

# Parse the response
data = pandas.read_csv(io.StringIO(response.content.decode('utf-8')), quotechar='\"',
                       delimiter=',', quoting=csv.QUOTE_ALL, skipinitialspace=True,
                       escapechar='\\')

# Remove spaces in the column names
data.columns = data.columns.str.strip()
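As with R, the POST boilerplate can be wrapped in a small helper so that running a query becomes a single call. A sketch along those lines (the endpoint and parameter names are taken from the example above; `run_query` and `parse_result` are our own helpers and the key is a placeholder):

```python
import io
import pandas
import requests

QUERY_URL = "https://www.neopoint.com.au/data/query"

def parse_result(csv_text):
    """Parse a CSV response body into a DataFrame."""
    return pandas.read_csv(io.StringIO(csv_text))

def run_query(query, key="*****"):
    """POST a SQL query to the data service and return the result as a DataFrame."""
    payload = {"query": query, "key": key, "format": "csv"}
    response = requests.post(QUERY_URL, data=payload)  # body, not URL parameters
    response.raise_for_status()
    return parse_result(response.text)

# Example call (requires a valid key):
# prices = run_query("select * from mms.tradingprice "
#                    "where SETTLEMENTDATE between '2018-11-02' and '2018-11-03'")
```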
Google Sheets can read data in XML or CSV format but we recommend CSV format. The screen shot below shows reading data in by pasting a command like: =ImportData("https://www.neopoint.com.au/Service/Csv?f=101+Prices%5CDispatch+Prices+5min&from=2017-12-10&period=Daily&instances=&section=-1&key=xxxx")
Alternatively if you want to read the data using a script you can use the sample below as a guide.
function myFunction() {
  var csvUrl = "https://www.neopoint.com.au/Service/Csv?f=101+Prices%5CDispatch+Prices+5min&from=2017-12-10&period=Daily&instances=&section=-1&key=xxxx";
  var csvContent = UrlFetchApp.fetch(csvUrl).getContentText();
  var csvData = Utilities.parseCsv(csvContent);
  var sheet = SpreadsheetApp.getActiveSheet();
  sheet.getRange(4, 4, csvData.length, csvData[0].length).setValues(csvData);
}