R systematic trading

By: Dmi3i Date: 14.06.2017

Linking R to IQFeed with the QuantTools package

IQFeed is a well known and recognized data feed provider geared toward retail users and small institutions. Stanislav Kovalevsky has developed a package called QuantTools that allows you to download and organize historical market data from multiple sources like Yahoo, Google, Finam, MOEX and IQFeed.

The feature that interests me the most is the ability to link IQFeed to R. More information can be found on the QuantTools website. QuantTools offers four main functionalities: getting market data, storing and retrieving market data, plotting time series data, and back testing. To download data, first make sure that IQFeed is open; you can then request either daily or intraday data.


The code below downloads daily Open, High, Low and Close prices for SPY from 1st Jan 2017 to 1st June 2017. Note the period parameter: it controls the bar frequency, from tick data and intraday bars up to daily, weekly and monthly bars.
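
Something like the following, assuming QuantTools' get_iqfeed_data function (the dates and the '1min' period value are illustrative):

    library(QuantTools)
    # daily OHLC prices for SPY (period defaults to daily bars)
    spy <- get_iqfeed_data(symbol = 'SPY', from = '2017-01-01', to = '2017-06-01')
    # intraday example: one-minute bars via the period argument
    spy_1min <- get_iqfeed_data(symbol = 'SPY', from = '2017-05-01',
                                to = '2017-05-02', period = '1min')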

QuantTools makes the process of managing and storing tick market data easy. You just set up the storage parameters and you are ready to go. The parameters are where the data lives, since what date, and which symbols you would like to be stored. Any time you can add more symbols, and if they are not present in the storage, QuantTools tries to get the data from the specified start date. The code below will save the data in the directory specified in the settings: there is one sub-folder per instrument and the data is saved in .rds files.
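
A sketch of the storage setup, assuming the package's QuantTools_settings and store_iqfeed_data functions (the path, symbols and start date are illustrative):

    library(QuantTools)
    settings <- list(
      iqfeed_storage = file.path(path.expand('~'), 'Market Data', 'iqfeed'),
      iqfeed_symbols = c('SPY', 'QQQ'),
      iqfeed_storage_from = '2017-01-01'
    )
    QuantTools_settings(settings)
    # download everything from the start date and keep the storage up to date
    store_iqfeed_data()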


You can also store data between specific dates. In the example below, I first retrieve the tick data stored above, then select the first few price observations and finally draw the chart. You can refer to the Examples section on the QuantTools website for more details.
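
A sketch of that retrieval step, assuming get_iqfeed_data's local argument reads from the storage and that plot_ts is the package's plotting helper (dates illustrative):

    library(QuantTools)
    # read tick data back from local storage instead of requesting it from IQFeed
    ticks <- get_iqfeed_data(symbol = 'SPY', from = '2017-06-01', to = '2017-06-02',
                             period = 'tick', local = TRUE)
    first_ticks <- ticks[1:100]  # keep only the first few observations
    plot_ts(first_ticks)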

Overall I find the package extremely useful and well documented. The only missing bit is a live feed between R and IQFeed, which would make the package a real end to end solution.

A few months ago a reader pointed out to me a new way of connecting R and Excel: BERT, which stands for Basic Excel R Toolkit. At the time of writing, BERT is at major version 1. From a more technical perspective, BERT is designed to support running R functions from Excel spreadsheet cells.

By combining the power of XML, VBA, R and BERT, I can create a good looking yet powerful application in the form of an Excel file with minimal VBA code. Ultimately I have a single Excel file gathering all the tasks necessary to manage my portfolio. In the next sections I present the prerequisites to develop such an approach and a step by step guide that explains how BERT can be used to simply pass data from R to Excel with minimal VBA code.

Once the installation has completed, you should have a new Add-Ins menu in Excel with buttons as shown below; this is how BERT materializes in Excel. The Custom UI Editor allows you to create user defined menus and buttons in the Excel ribbon. Next, we define the R code whose output is what we want to retrieve in Excel.

Create a file called myRCode.R (any other name is fine) in a directory of your choice, and paste the following code into it.
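
The original file's contents are not preserved here; as a stand-in, a minimal function of the kind BERT can expose to Excel might look like this (the function name and body are hypothetical):

    # myRCode.R: a tiny function BERT will expose to Excel (hypothetical example)
    myRandomSeries <- function(n = 100) {
      # return n normally distributed values; Excel receives them as a range
      rnorm(as.numeric(n))
    }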


This just sources into BERT the R file you created above; save and close functions.R when done. Next, create and save a macro-enabled file called myFile.xlsm in a directory of your choice. Once the file is saved, close it, then open it in the Custom UI Editor and paste the below code.
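
The sourcing step itself is a single line in BERT's functions.R startup file (the path is illustrative):

    # functions.R (BERT startup file): load the user functions defined above
    source('C:/path/of/your/choice/myRCode.R')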

You should see something like this. Create a new VBA module and paste the code below into it. You should then see something like the below appear.


From my perspective, the interest of such an approach is the ability to glue together R and Excel, obviously, but also to include, via XML and batch files, pieces of code from Python, SQL and more.

This is exactly what I needed. Finally, I would be curious to know if anyone has any experience with BERT.

Making the most of the out of sample data

When building a trading model, part of the data is usually held back for out of sample testing. A comparison of the in sample and out of sample data helps to decide whether the model is robust enough. This post aims at going a step further and provides a statistical method to decide whether the out of sample data is in line with what was created in sample.

There is a non-parametric statistical test that does exactly this: using the Kruskal-Wallis test, we can decide whether the population distributions are identical without assuming them to follow the normal distribution. Other tests of the same nature could fit into that framework. I randomly sampled subsets of the in sample data, tested each subset against the out of sample data, and recorded the p-values.

This process creates not a single p-value for the Kruskal-Wallis test but a distribution, making the analysis more robust.
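
A minimal sketch of the idea, using base R's kruskal.test and simulated returns in place of real strategy data:

    set.seed(123)
    in_sample  <- rnorm(1000)   # stand-in for in sample strategy returns
    out_sample <- rnorm(250)    # stand-in for out of sample returns
    # repeatedly test a random in sample subset against the out of sample data
    p_values <- replicate(1000, {
      sub_sample <- sample(in_sample, length(out_sample))
      kruskal.test(list(sub_sample, out_sample))$p.value
    })
    # the distribution of p-values is what the analysis is based on
    hist(p_values, main = 'Kruskal-Wallis p-values', xlab = 'p-value')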

As usual, what is presented in this post is a toy example that only scratches the surface of the problem and should be tailored to individual needs.

Introducing fidlr: FInancial Data LoadeR

This initial version is a wrapper around the getSymbols function in the quantmod package, and only Yahoo, Google, FRED and Oanda are supported. As usual with those things, just a kind reminder: this is a very first version of the project, so do not expect perfection, but hopefully it will get better over time.

Please report any comments, suggestions, bugs etc… to the author.

Maintaining a database of price files in R

Doing quantitative research implies a lot of data crunching, and one needs clean and reliable data to achieve this. What is really needed is clean data that is easily accessible, even without an internet connection.

The most efficient way to do this for me has been to maintain a set of csv files. I have one csv file per instrument and each file is named after the instrument it contains.

The reason I do so is twofold: it is simple, and it has proved very efficient so far. The process is summarized in the chart below. In everything that follows, I assume that data is coming from Yahoo.

The code will have to be amended for data from Google, Quandl etc… In addition, I present the process of updating daily price data. The code below sits in a script that is run regularly; note that I added an output log file (updateLog) to keep track of each run. The process above is extremely simple because it only describes how to update daily price data.
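
A minimal sketch of such an update step, assuming one CSV per instrument with a Date column and quantmod's getSymbols for the download (the path, symbols and log file name are illustrative):

    library(quantmod)

    data_dir <- '~/MarketData'
    symbols  <- c('SPY', 'QQQ')

    for (sym in symbols) {
      file <- file.path(data_dir, paste0(sym, '.csv'))
      old  <- read.csv(file, stringsAsFactors = FALSE)
      last_date <- as.Date(max(old$Date))
      # download only the missing days since the last stored date
      new <- getSymbols(sym, src = 'yahoo', from = last_date + 1, auto.assign = FALSE)
      if (NROW(new) > 0) {
        new_df <- data.frame(Date = index(new), coredata(new))
        write.table(new_df, file, sep = ',', append = TRUE,
                    row.names = FALSE, col.names = FALSE)
        cat(format(Sys.time()), sym, 'updated', '\n',
            file = 'updateLog.txt', append = TRUE)
      }
    }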

The Rise of the Robots Advisors…

The Asset Management industry is on the verge of a major change. Over the last couple of years, Robots Advisors (RA) have emerged as new players. The term itself is hard to define as it encompasses a large variety of services. I found the Wikipedia definition pretty good. When RA get their names on TV ads or on the roofs of NYC cabs, you know something big is happening…

In this post R is just an excuse to present nicely what is a major trend in the asset management industry. Those figures are a bit dated given how fast this industry evolves but are still very informative.

It is starting to significantly affect the way traditional asset managers are doing business. Despite all the above, I think the real change is ahead of us. Ultimately it will affect the way traditional investment firms do business. Active portfolio management, which has been having a tough time for some years now, will suffer even more. Another potential impact is the rise of ETFs and low commission financial products in general. Obviously this started a while ago, but I do think the effect will be even more pronounced in the coming years.

This trend will inevitably get stronger.

R financial time series tips everyone should know about

Some of the functions presented here are incredibly powerful but unfortunately buried in the documentation, hence my desire to create a dedicated post. I only address daily or lower frequency time series.

The example below loads the xts package and creates a daily time series of normally distributed returns. When merging series, the join argument of merge.xts does the magic! Among the functions worth knowing: period.apply applies a specified function to each distinct period in a given time series object; endpoints extracts the index values of a given xts object corresponding to the last observations of each period specified by on; na.locf is a generic function for replacing each NA with the most recent non-NA prior to it; and charts.PerformanceSummary, for a set of returns, creates a wealth index chart, bars for per-period performance, and an underwater chart for drawdown.
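
A minimal sketch putting those functions together (xts, zoo and PerformanceAnalytics; the series itself is simulated):

    library(xts)
    library(PerformanceAnalytics)

    dates   <- seq(as.Date('2015-01-01'), by = 'day', length.out = 400)
    returns <- xts(rnorm(400, sd = 0.01), order.by = dates)

    # merge two series; the join argument controls the type of join
    both <- merge(returns, lag(returns), join = 'inner')

    # apply a function to each month
    monthly_mean <- period.apply(returns, endpoints(returns, on = 'months'), mean)

    # index values of the last observation of each week
    week_ends <- index(returns)[endpoints(returns, on = 'weeks')]

    # replace NAs with the most recent non-NA prior to them
    with_na <- returns
    with_na[sample(400, 20)] <- NA
    filled <- na.locf(with_na)

    # wealth index, per-period performance and drawdown in a single window
    charts.PerformanceSummary(returns)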

This is incredibly useful, as it displays in a single window all the relevant information for a quick visual inspection of a trading strategy. The list above is by no means exhaustive, but once you master the functions described in this post, the manipulation of financial time series becomes a lot easier, the code shorter, and its readability better.

Factor Evaluation in Quantitative Portfolio Management

When it comes to managing a portfolio of stocks versus a benchmark, the problem is very different from defining an absolute return strategy. I strongly encourage anyone with an interest in the topic to read the book from beginning to end. In this post I focus on two simple and widely used metrics: the Information Coefficient (IC) and the quantiles return. The IC gives an overview of the factor's forecasting ability.

Obviously ICs must be as high as possible in absolute terms. The usefulness of the quantiles tool is also straightforward: a factor can have a good IC, but its predictive power might be limited to a small number of stocks. Obviously there is a survivorship bias in the stock universe used. The measure used here is much less sensitive to outliers than the arithmetic mean. The result is very significant and powerful for such a simple factor (not really a surprise though…).
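
A minimal sketch of both metrics on simulated data (a rank IC via Spearman correlation, and the average forward return per factor quintile; all numbers are illustrative):

    set.seed(1)
    factor_scores <- rnorm(500)                         # illustrative factor values
    fwd_returns   <- 0.05 * factor_scores + rnorm(500)  # illustrative forward returns

    # Information Coefficient: rank correlation between factor and forward returns
    ic <- cor(factor_scores, fwd_returns, method = 'spearman')

    # quantiles return: average forward return within each factor quintile
    quintile <- cut(factor_scores,
                    breaks = quantile(factor_scores, probs = seq(0, 1, 0.2)),
                    labels = paste0('Q', 1:5), include.lowest = TRUE)
    tapply(fwd_returns, quintile, mean)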

Therefore there are greater chances to beat the index by overweighting the stocks falling into Q5 and underweighting those falling into Q1 relative to the benchmark. Even a modest IC in absolute terms can be valuable; formal significance tests can be evaluated, but this is beyond the scope of this article.

In the hedge fund world, people have very low tolerance for drawdown. Generally, as a new trader in a hedge fund, assuming you come with no reputation, you have very little time to prove yourself. You should make money from day one and keep on doing so for a few months before you gain a bit of credibility.

In the chart below I simulated the experiment with one of my strategies. You start trading on 1st June and all goes well until 23rd July, when a drawdown hits and, under a typical policy, your allocation gets cut. Had the allocation been kept unchanged, the high water mark level would have been crossed on 28th October. Still on the chart above, assume you get really unlucky and start trading toward mid-June. Had you started in early August instead, your allocation would not have been cut at all, and you would end up having a good year in only four full months of trading.
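
For reference, a minimal sketch of how the high water mark and drawdown behind such a chart are computed from an equity curve (simulated here):

    set.seed(42)
    equity <- cumprod(1 + rnorm(250, mean = 5e-4, sd = 0.01))  # simulated equity curve
    hwm    <- cummax(equity)       # high water mark: running maximum of equity
    dd     <- equity / hwm - 1     # drawdown: distance below the high water mark
    plot(dd, type = 'l', xlab = 'trading day', ylab = 'drawdown')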

In those two examples nothing has changed but your starting date… If managed properly, you have a chance to stay in the game long enough to realise the potential of your strategy.

I remember struggling for days with a simple WiFi connection because drivers were not readily available. Things have changed dramatically since then. Last week I installed Ubuntu Linux, and in this post I explain step by step what I did. The latest version of R can be obtained from CRAN.

I used a small utility called gksudo to open and modify the sources.list file. In the command line, type the following (the gedit editor is assumed):

    gksudo gedit /etc/apt/sources.list

This will open the sources.list file. You just need to add the CRAN repository entry for your Ubuntu release, then save and close. There are other ways of doing this, but adding an entry to the sources.list file is the most straightforward. Ubuntu uses apt for package management; apt stores the list of repositories (software channels) in the sources.list file.

By editing this file from the command line, software repositories can be added or removed. All this might not be perfect but it worked for me without a glitch. The R Trader Using R and related tools in Quantitative Finance.


