I have an Excel spreadsheet that I use to manage my finances. Over the years it has grown in complexity, and I have automated it more and more with VBA macros. The one thing I still did manually was entering the current values of the stocks and mutual funds I own. Then I realized that a single line of PowerShell can fetch such a quote, in this case Microsoft's last sale price from the NASDAQ website:
((wget http://www.nasdaq.com/symbol/msft).AllElements | where id -eq "qwidget_lastsale").innerText
This sparked my interest in automating this part of my spreadsheet as well.
Poking around the source code of my favorite financial news website, I noticed that the current price of each financial instrument I was interested in was wrapped in a span element with a distinctive CSS class.
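As an illustration, the relevant markup might look something like this (the class name quote-price is made up; the actual name depends on the site):

```html
<p>MSFT <span class="quote-price">123,45 EUR</span> today</p>
```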
Using PowerShell works
The first version I came up with was basically just executing the above statement (with some modifications to filter by class) on the command line.
Sub UpdateQuoteWithPowershell(sUrl As String, sClass As String, sTargetCell As String)
    Dim sCommand As String
    sCommand = "powershell ((wget " & sUrl & ").AllElements | where class -eq " & sClass & " | Select-Object -first 1).innerText"
    Dim objShell As Object
    Dim objExecObject As Object
    Set objShell = CreateObject("WScript.Shell")
    Set objExecObject = objShell.Exec(sCommand)
    Dim sQuote As String
    sQuote = objExecObject.StdOut.ReadAll()
    CleanAndWriteQuote sQuote, sTargetCell
End Sub
It’s straightforward (all I had to do was search online for how to execute a command line from VBA) and it works. Unfortunately, it is also rather slow, taking a couple of seconds per quote. Furthermore, it has the ugly side effect of flashing a command window while the script runs.
Using MSXML is faster
So I did some more online research and came across a couple of people using the Microsoft XML parser (MSXML) to download and parse webpages; I eventually combined the approaches from multiple sources. My first instinct was to download the webpage and do the same kind of filtering for the first element with the correct class attribute. This ran into problems, as the HTML pages in question aren’t parseable as XML.
Thus I ended up using the XMLHTTP60 object merely as a means of downloading the page (there are probably other ways to do this, e.g. by automating Internet Explorer). Extracting the information I need is done with VBA’s Split function.
Sub UpdateQuoteWithMSXML(sUrl As String, sClass As String, sTargetCell As String)
    Dim xmlHttpRequest As MSXML2.XMLHTTP60
    Set xmlHttpRequest = New MSXML2.XMLHTTP60
    xmlHttpRequest.Open "GET", sUrl, False
    xmlHttpRequest.send
    If xmlHttpRequest.readyState <> 4 Or xmlHttpRequest.Status <> 200 Then
        Sheet1.Range(sTargetCell).Value2 = CVErr(xlErrNA)
    Else
        Dim sQuote As String
        sQuote = Split(Split(xmlHttpRequest.responseText, "span class=""" & sClass & """>", , vbTextCompare)(1), "<")(0)
        CleanAndWriteQuote sQuote, sTargetCell
    End If
End Sub
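If you want to prototype the extraction outside of Excel, the same double-split logic can be sketched in Python (the page fragment and the class name quote-price are made up for illustration; note that unlike the vbTextCompare variant above, Python's split is case-sensitive):

```python
def extract_quote(html: str, css_class: str) -> str:
    """Mimic the VBA double Split: take the text between the opening
    span tag with the given class and the next '<'."""
    marker = 'span class="%s">' % css_class
    after_marker = html.split(marker, 1)[1]   # everything after the opening tag
    return after_marker.split("<", 1)[0]      # everything up to the next tag

# Hypothetical page fragment:
page = '<p>MSFT <span class="quote-price">123,45 EUR</span> today</p>'
print(extract_quote(page, "quote-price"))  # 123,45 EUR
```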
NB: This approach requires you to add a reference to "Microsoft XML, v6.0" to your VBA project via the Tools -> References… menu.
In the interest of completeness, here’s the helper method CleanAndWriteQuote I use in the snippets above:
Sub CleanAndWriteQuote(sQuote As String, sTargetCell As String)
    sQuote = Replace(sQuote, ",", ".")
    sQuote = Replace(sQuote, " EUR", "")
    Sheet1.Range(sTargetCell).Value2 = Val(sQuote)
End Sub
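For completeness, the same clean-up (decimal comma to point, currency suffix stripped, then conversion to a number) can be sketched in Python as well:

```python
def clean_quote(raw: str) -> float:
    """Turn a scraped string like '123,45 EUR' into a float,
    mirroring the VBA helper: the comma becomes a decimal point
    and the ' EUR' suffix is dropped."""
    return float(raw.replace(",", ".").replace(" EUR", ""))

print(clean_quote("123,45 EUR"))  # 123.45
```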
Any further error handling (e.g. when downloading the webpage fails or no span with the given class is found) is left as an exercise for the reader.
I also wrote a wrapper method that calls the above update routine with the data for each of the financial instruments I own. It is hooked up to the workbook’s Open event, so the quotes refresh automatically whenever I open the spreadsheet. This saves me quite a bit of time every day.
Next, I’m thinking about storing the previous day’s quotes, so I can include an indicator of whether my holdings went up or down.