COMPOSITE SECTOR ETF VALUATION UPDATED [5.24.2015]

Check out my updated IPython Notebook, where I take a look at changes and trends in ETF valuations using the Implied Cost of Capital model. To learn more about the model and the methodology used, see here and here.


Composite Sector ETF Valuation updated [5.10.2015]

Check out my updated IPython Notebook, where I take a look at changes and trends in ETF valuations using the Implied Cost of Capital model. To learn more about the model and the methodology used, see here and here.


Sector ETF Valuation Using the Implied Cost of Capital (ICC) Model

This post is part of a series examining the ICC model's use as a valuation tool. I first introduced the topic in this post, where I outlined the following:

  • how I calculate the ICC formula for use in this sector ETF relative valuation model
  • my assumptions for the model
  • expected model output and sanity check
  • why and how I use the model results to enhance my investing

Recently I expanded on the subject by detailing the Python code I use to run the analysis, along with my interpretations of the model's output. For detailed coding/quant analysis I will be using the IPython Notebook and NBViewer to distribute and share the code. Unfortunately, Squarespace.com (my current host) doesn't have a good way to display IPython Notebooks, so I will post a link to my research with a screenshot below. Take a look, and as always I can be reached @blackarbsCEO for feedback.
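
To make the mechanics concrete: the ICC is the discount rate that equates today's price to the present value of expected future payoffs. The sketch below backs it out by bisection for a simple Gordon-growth payout stream. This is a deliberately simplified stand-in for the earnings-forecast inputs the actual model uses, so the function and its inputs are purely illustrative:

```python
def implied_cost_of_capital(price, payout_next, growth,
                            lo=0.001, hi=1.0, tol=1e-8):
    """Solve for the discount rate r such that the present value of a
    perpetually growing payout stream equals the observed price.

    Under Gordon growth, PV(r) = payout_next / (r - growth), so we
    bisect on f(r) = PV(r) - price. Illustrative only -- the real model
    uses sector-level earnings forecasts, not a single perpetuity.
    """
    lo = max(lo, growth + 1e-6)  # PV is undefined for r <= growth
    f = lambda r: payout_next / (r - growth) - price
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        # f is decreasing in r: if PV is still above price, r must rise
        if f(mid) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

# Sanity check: with P=100, D1=5, g=3%, the closed form gives
# ICC = D1/P + g = 8%
icc = implied_cost_of_capital(100.0, 5.0, 0.03)
```

The sanity check mirrors the "expected model output" step above: for inputs with a known closed-form answer, the solver should reproduce it before it is trusted on real data.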

How to get Free Intraday Stock Data from Netfonds

Daily stock data is everywhere for free. Yahoo, Google, and Quandl all provide daily stock prices suitable for basic number crunching. However, intraday stock data for computational analysis is much harder to find, and it can be very expensive. So what is a cost-conscious quant supposed to do?

The Norwegian website Netfonds.no provides free intraday data for stocks and ETFs on the NYSE, Nasdaq, and Amex exchanges, with up to five days of trade/bid/offer history. I wrote some code to grab that data quickly and easily, which I will share with the Python community.

Before I paste the code below, let me give an h/t to Yves Hilpisch, who wrote the excellent 'Python for Finance' book from which I borrowed some of the following code.

I've embedded the entire script below, using Nike (NKE) as the example ticker. Feel free to ask questions and/or add your own touches.
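
The core of the approach can be sketched as follows. The URL pattern below matches what Netfonds exposed at the time (`posdump` for bid/offer snapshots, `tradedump` for trades, with an exchange suffix on the ticker such as `.N` for NYSE or `.O` for Nasdaq) -- treat the exact pattern as an assumption, since the service can change it:

```python
from urllib.parse import urlencode

BASE = "http://hopey.netfonds.no/{dump}.php"

def netfonds_url(paper, date, dump="posdump"):
    """Build the Netfonds intraday CSV URL.

    paper -- ticker plus exchange suffix, e.g. 'NKE.N' (NYSE) or
             'AAPL.O' (Nasdaq)
    date  -- 'YYYYMMDD' string; only the last ~5 trading days are kept
    dump  -- 'posdump' for bid/offer snapshots, 'tradedump' for trades
    """
    qs = urlencode({"date": date, "paper": paper, "csv_format": "csv"})
    return BASE.format(dump=dump) + "?" + qs

# Fetching is then a one-liner with pandas (requires network access):
# import pandas as pd
# trades = pd.read_csv(netfonds_url("NKE.N", "20150522", "tradedump"))
```

Because `pandas.read_csv` accepts a URL directly, there is no separate download step -- each day's file parses straight into a DataFrame, and five daily frames can be concatenated for a full week.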


Here are the resulting sample plots.


Project Update: iVC Reporting Engine

Still working industriously behind the scenes, I thought I'd take some time to give a progress report. The good news is that the iVC Reporting Engine is almost fully operational. I've been able to automate the following processes:

Data Collection: I have two methods to obtain public company filings from the SEC via Python scripts.

  • The primary method leverages the excellent services of the free (for now) data provider Quandl.com. They aggregate and distribute data from several primary sources, including the SEC, and normalize it as best as possible given the business constraints of a young company. In keeping with their mission to improve data accessibility for all, they also provide a Python module and a general API so motivated users can automate data queries.
  • Additionally, I have a backup database of every public company filing since 2012, which leverages Arelle's open-source efforts to standardize and improve the XBRL standard for finance-IT adoption. My database is configured to use PostgreSQL with Arelle's schema. This still requires more examination of the XBRL format and/or command-line interfaces to return the data I require. Marked: in development for production purposes.

Data Processing: There is some overlap with the above; however, this covers the organizing and filing of the data returned by the collection process. Not a lot to say here, other than that the files are self-organizing and allow for flexible structures during continued operational development.
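
As a sketch of the primary Quandl route above: the snippet below builds a dataset request URL against their v3 REST endpoint by hand (their own Python module wraps the same endpoint). The `SEC/...` dataset code shown is hypothetical -- the actual codes for a given filing item have to be looked up in Quandl's catalog:

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

API_ROOT = "https://www.quandl.com/api/v3/datasets"

def dataset_url(code, api_key=None, **params):
    """Build a Quandl v3 dataset URL for a code like 'SEC/NKE'
    (hypothetical example code -- check Quandl's catalog)."""
    if api_key:
        params["api_key"] = api_key
    url = "{root}/{code}.json".format(root=API_ROOT, code=code)
    if params:
        url += "?" + urlencode(sorted(params.items()))
    return url

def fetch_dataset(code, **kwargs):
    """Download and decode one dataset (network access required)."""
    with urlopen(dataset_url(code, **kwargs)) as resp:
        return json.load(resp)
```

Parameters such as `rows` or `start_date` pass straight through to the query string, so one helper covers both bulk pulls and incremental updates.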

Calculation Engine: This is pretty much the final major component before large-scale testing can begin. It breaks down as follows.

  • I've been able to transition ~99% of my previous Excel and Python calculations into this component. It accesses the processed data and strips it down to run the calculations as efficiently as possible, currently averaging ~15.5 seconds per cycle.
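
The per-cycle average quoted above comes from the usual wall-clock measurement pattern; a minimal version (with a hypothetical `run_cycle` callable standing in for the engine's actual cycle) looks like:

```python
import time

def average_cycle_seconds(run_cycle, n_cycles):
    """Run the calculation cycle n times and return the average
    wall-clock seconds per cycle. `run_cycle` is a stand-in for the
    engine's real cycle function."""
    elapsed = []
    for _ in range(n_cycles):
        t0 = time.perf_counter()
        run_cycle()
        elapsed.append(time.perf_counter() - t0)
    return sum(elapsed) / len(elapsed)
```

`time.perf_counter` is preferred over `time.time` here because it is monotonic and has the highest available resolution for interval timing.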

Under Construction: All that's left, really, is to incorporate the plotting/charting functions and the final production output format. Once that is complete, live production testing can begin.

You better tell somebody, Blackarbs LLC is coming.