Webank Event at NESTA

Just attended the Webank event at NESTA tonight, which drew a capacity crowd of finance industry entrepreneurs, lawyers, VCs and bloggers. Three interesting companies presented at the event, each with their own angle on p2p finance:

Zopa presented a brief introduction to their model and gave some interesting statistics about their typical user base. Zopa (in the UK at least) seems to be profiting handsomely from the current credit market difficulties, although it seems they may not quite be in the black yet. Their US operation appears to be less of a pure p2p play. Average returns are apparently around 9%, with a default rate of around 0.2%.

Kubera Money presented their vision of a p2p lending platform based on the chit fund, or ROSCA, model. There was not much detail on how the schemes will actually work in a distributed setting, but it sounds like they have cleared the FSA regulatory hurdles, so hopefully they will progress quickly.

Midpoint Transfer was the last startup to present. Their business model revolves around disintermediating FX dealers by offering FX rates at the midpoint (no spread), with all trades offset via a linked network of Midpoint-controlled intermediary accounts. Trades are guaranteed to be matched the same day, Midpoint takes on the credit risk, and there is a flat $30 fee per transaction. Having worked in FX for a couple of years, I am sceptical about two things. First, the guaranteed matching: unless they have thousands of users, unmatched exposure will be a common occurrence, and they will have to hedge it themselves. Second, the midpoint is not set until the trade is actually matched, which could mean a very unfavourable price for a customer if a trade is not matched until end of day during a period of high volatility. In that case, there is a real risk that customers would have been much better off simply dealing through a broker.

Afterwards, there was an entertaining verbal sparring match between the Zopa CEO and James Gardner, the head of innovation at a large retail bank, who both argued their views eloquently. Met some interesting people at the event – there are a lot of innovative ideas coming down the line in p2p finance.

Debugging DLL Loading Issues in Windows

Attempting to debug dynamic library loading issues on Windows can be tricky; on Linux you can use strace to monitor dynamic linking and see which libraries are loaded as the application runs. Having just run into DLL loading problems in an R extension I wrote, I was keen to find a good way to debug these issues as they occur. Thankfully, there are some good utilities out there for debugging this kind of thing:

Dependency Walker is an old SDK tool that is still invaluable. Drag a .DLL or .EXE into the application window and it will show you the dependencies, what type of loading mechanism they use (e.g. eager or delay load), and also what exported functions from the dependent DLLs are actually used by the .DLL or .EXE you have selected.

Here is a screenshot of depends.exe operating on my R extension DLL. Note that MSVCR80.DLL and R.DLL are both highlighted as missing, which is extremely useful for figuring out dependent DLL load issues.

Dependency Walker
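
By way of illustration, this is roughly how a missing dependent DLL tends to surface on the R side before reaching for these tools (a minimal sketch; the DLL name is a hypothetical stand-in for your own extension):

# Attempt to load an extension DLL and report the loader error if a
# dependent DLL (e.g. R.dll or the MSVC runtime) cannot be found
tryCatch(dyn.load("myExtension.dll"),
         error = function(e) message("Load failed: ", conditionMessage(e)))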

The SysInternals Process Monitor is an incredibly useful low-level Swiss Army Knife utility that can be used, among other things, to monitor dynamic library loading activity as it occurs, using the file activity view. Here is a view of Process Monitor monitoring R as it loads my extension DLL. The subsequent dependent DLL loading activity is highlighted.

Process Monitor
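
Once these tools have pointed at the missing dependencies (R.dll and the MSVC runtime in the screenshots above), one workaround from within R is to make sure the directories containing them are on the PATH before loading the extension. A rough sketch, again with a hypothetical extension name:

# Prepend R's bin directory (which contains R.dll) to PATH so the loader
# can resolve it when the extension DLL is loaded
Sys.setenv(PATH = paste(R.home("bin"), Sys.getenv("PATH"), sep = ";"))
dyn.load("myExtension.dll")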

R/Reuters Real-Time Data Extension

Last week, I posted an R extension DLL that downloads historical data from Reuters using the SFC API. This time around, I am posting an extension DLL that can subscribe to real-time updates using the RFA API. This extension allows you to specify a list of items and data fields, and subscribe for a specified amount of time to updates on those items.

Here is an example:

# Load the DLL
dyn.load("RfaClient")

# Low-level subscription function wrapper
rsub <- function(time, items, func, sessionName, config, debug) {
    .Call("subscribe", as.integer(time), items, func, sessionName, config, debug)
}

# Define items and fields to subscribe to
items <- list()
fields <- c("BID", "ASK", "TIMCOR")
items[[1]] <- c("IDN_SELECTFEED", "GBP=", fields)
items[[2]] <- c("IDN_SELECTFEED", "JPY=", fields)

# Callback function (invoked when data items update)
callback <- function(df) {
    print(paste("Received an update for", df$ITEM, ", update time=", df$TIMCOR))
}

# Subscribe for 5 seconds using the supplied config parameters
rsub(5000, items, callback, "clientSession", "rfa.cfg", FALSE)
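
As a small variation on the example above (a sketch reusing only the pieces already defined there), the callback can accumulate updates for later inspection rather than printing them:

# Collect each update into a list instead of printing it
updates <- list()
collector <- function(df) {
    updates[[length(updates) + 1]] <<- df
}

# Subscribe for 5 seconds as before, but with the collecting callback
rsub(5000, items, collector, "clientSession", "rfa.cfg", FALSE)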

A short introductory guide is supplied:

Introduction (PDF)

The functionality is pretty basic right now, and there may be issues with the code (e.g. memory leaks, performance issues). Any feedback is gratefully received.

The package, including source, can be downloaded here:

rfaclient.zip

It was built with R 2.8.0 and RFA C++ 6.0.2, using Visual C++ 2005.