portfolio (Q2 2017) - update 2

swarm city - 40.56%

ethereum - 43.95%

aeternity - 9.59%

bitcoin - 1.5%

qrl - 4.34%

other bets:

  1. urbit (pre-launch, so current effective valuation should probably be $0 despite costs associated with purchase)

portfolio (Q2 2017) - update 1

swarm city - 42% (reflects appreciation of underlying asset from $2.15 purchase price to current market rate of $3.50 per)

ethereum - 50%

bitcoin - 8% (includes asset and related derivatives used for hedging)

other bets:

  1. urbit (pre-launch, so current effective valuation should probably be $0 despite costs associated with purchase)
  2. aeternity (same comment as urbit)

portfolio (Q2 2017)

1) ethereum: fundamentally revolutionary technology if we make it work and don't abuse the ease of creating tokens and performing large-scale crowd sales. (portfolio composition - 50%)

2) bitcoin: the first cryptocurrency, with the greatest consumer recognition (despite still being relatively unknown in the greater scheme of things...). two caveats to this: 1) I tried to buy a froyo this morning and asked the girl if they accepted bitcoin. "what? what's that?" was the response, which clearly means the community needs to work on getting bitcoin into the mainstream consciousness, or else the experiment will 'fail' for whatever definition of failure you want to apply. and 2) core devs need to get their act together and deal with how to scale the network properly. pretty soon ethereum will eclipse btc's market cap, and if that happens I would liquidate my position completely for ethereum. (22%)

3) swarm city: the only dApp on ethereum that looks like it has any practical purpose that normal people would use. hopefully the tags system gets iterated on, as it could get pretty ugly in practice. rough start with the first co-founder but still a great project with real value. excited for their first release on june 17! this is the kind of stuff people are interested in, not decentralized supercomputers or prediction markets with dubious real-world value. (18%)

4) aeternity: another blockchain project (sorry...). no reason I'm involved other than my faith in Zack Hess. I really think ethereum will leave these guys in the dust though; they have a lot to demonstrate if they want to stay relevant in the face of ethereum's extremely involved community. also, I love erlang, so it wins some points right off the bat. (5%)

5) urbit: decentralize All The Things! very weird but very compelling project. excited for the potential this has, and actively working on building out better documentation for the urbit community until better tooling is available to build real-world applications on top of this platform. the need to tie real-world identity to digital identity is apparent from my involvement with smart contract development on ethereum, and it's clear that despite the mental overhead of the urbit universe and the odd linguistic choices, curtis yarvin is a very smart (if controversial...) guy, and this project could easily be the most disruptive. but overall this is the weirdest project of the above by far. (5% - will increase this when they auction off more stars)

Macro cryptoasset commentary (Q2 2017, v. 2) - Ethereum

Since the last commentary post, Ether units (ether) have appreciated in value from $40.15 to a high of $212.50 this week before correcting to around $145, where the price sits currently. I expect it to drop further - the current social utility of ether has not grown at a pace that matches the price increase we've seen over the last few months. The largest (read: most well-funded) distributed apps (dApps) built on Ethereum include Aragon, a platform for managing decentralized organizations, Golem, a distributed supercomputing project (a little like BOINC), Gnosis and Augur, two competing platforms that enable decentralized prediction markets, and the now-defunct TheDAO, which raised $150M before its core smart contract code was exploited, draining the investment vehicle of a third of its capital. This debacle led to a contentious fork of the Ethereum blockchain, and two competing cryptoasset systems now exist as a result.

I found it extremely amusing to wake up after the fork to find that Kraken had credited 375 ETC (Ethereum Classic tokens) to me without any action needed on my part because, well... it's a separate asset after all! On a separately maintained divergent blockchain and everything! ... It's even funnier when you know that cryptocurrencies have long been paraded as a solution to double-spending problems, which holds true unless a group of renegades decides to fork your protocol and call their forked version of your cryptocurrency something else. All of this happened because some of the Ethereum core developers had a large stake in TheDAO, so they were biased toward a solution that would "rewrite history" and re-credit the ether balances of users who had invested in TheDAO's initial fundraising round. I agree with what was done, given the technology's immaturity: mistakes should be corrected as the technology moves toward true maturity. All that said, there is absolutely no chance that Ethereum truly merits a >$15B market cap at this time, either in a pure investment-value sense or in a technology-potential sense - there are serious unresolved issues with oracles (getting outside real-world data into the blockchain), and a very weird obsession with prediction markets, despite their observed failure to actually predict Brexit or the US presidential election of the king of carrots.
The various development teams working on dApps should be able to exchange ether for dollars so they can buy servers and ramen/beer and maintain their sanity while they work on new innovations (this is the whole point of secondary markets, after all). But ICOs are now clearly being used as a mechanism to enrich opportunists who have, under most legal interpretations I've heard (from securities lawyers who have handled offerings of new securities and investment vehicles), violated various SEC statutes by selling instruments to the public with almost no real documentation of what the raise is for and what they intend to do with the funds once disbursed.

Given the relatively new laws surrounding crowdsales, it's encouraging to see how quickly this new technology enables any group of individuals to rapidly raise huge sums of money, but I urge potential ICO investors to be very careful about what they are getting into: perform proper due diligence and vet the developers behind any project of note before transmitting money. Most are complete vaporware, and I fear that many less-informed investors and members of the public will lose huge sums trying to get exposure to cryptocurrencies and cryptoassets.

I think the idea that in 10 years every major tech company will be replaced by a dApp with its own unit of exchange is beyond stupid - no one wants to deal with all this mental overhead just to use your toy. There won't be thousands of decentralized applications all interacting in an autonomous, distributed way. Fundamentally, corporations serve the needs of people; they are a vehicle for creating value by transforming a person's time and skills into a well-defined product or service that another person can use or consume. This exchange happens because the buyer sees the good or service as more valuable than whatever they exchanged to acquire it.
That's all - no magic. These digital assets have utility because I can transmit money without intermediaries, which has the potential to remove banks, brokerage houses, and other consumer financial service providers from the picture - but we're not there yet. I'm working on a consumer lending application on Ethereum because Solidity provides a relatively straightforward way of programmatically transmitting and storing value, in a way that could automate the lending flow completely. But interfacing a dApp with real-life human service providers (checking accounts, investment accounts) is challenging and will require significant development to simplify to the point of producing viable alternatives that people can really use. It is much more likely that Google's secret artificial general intelligence DeeperMind* wakes from its slumber next year and enslaves humanity than that hundreds of competing cryptocurrencies get used in daily life by normal, non-cryptofanatic people. And to accumulate all that power quickly enough to execute this magnificent coup, it'll very likely use bitcoin to transact with service providers and buy server time/storage... of course.

My friends and colleagues have indicated their interest in purchasing Bitcoin and Ether exclusively because of the speculative appeal of their price volatility, not because they are genuinely interested in the underlying technology's possibilities. This raises the question of whether the explosion in initial coin offerings (ICOs) should be considered more an innovation in fundraising technology than true, organic growth in the global usage of cryptocurrencies as a mechanism to buy goods and services in the real world or on the Internet.

* this is not real

Macro cryptoasset market commentary (Q1 2017, v. 1)

Bitcoin technical limitations becoming increasingly visible (as of mid-March, we're seeing txn fees of $2.75 due to txn block size limits)

Ethereum has largest active development community by far, despite #2 market cap. 

Dash's meteoric rise appears fraudulent; the project lacks focus on the technicals, and especially on substantiating its press-release propaganda by demonstrably releasing something with real-world use cases, something that solves a real problem rather than existing for its own sake (or for the sake of sustaining its creators' wealth)

Blockchain Capital ICO seems worth a little attention, but I'd advise against making a placement based exclusively on the ~5yr lockup mentioned in the CoinDesk article describing their intentions

"Blue oceans for banks"

Why is it that startups like Facebook and WhatsApp, which have been in business for only a decade, can have billions of users, while banks that have been around for centuries struggle to expand their global presence?
We believe that in a race to compete with rivals, banks have lost focus and failed to look at the bigger picture.
In today's interconnected and international world, the strategies that have traditionally been employed for market expansion are only leading to stagnation. Rather than competing head-on with rivals, we need to realign our focus and create new markets that make the existing competition entirely irrelevant. We need to look at the bigger picture.
Based on three months of research, we've created a simple plan that can help our partner bank create blue oceans of uncontested market space for itself, and in the process become a truly global entity that can serve the needs of trillion-dollar, recession-proof industries.

The underserved markets
“Companies can succeed not by battling competitors, but rather by creating 'blue oceans' of uncontested market space” (this sounds similar to Thiel's 'Zero to One' message about competition and the deliberate creation of monopolies as a fundamental strategy for early companies)
The standardization of banking services over the last decade has come at the cost of product differentiation and user expansion. There is no way for customers to tell one bank from another, and hence no reason for them to choose any particular bank. For continued growth a bank needs to invest in conquering new markets and uncovering new sources of customers. The best markets are the ones that are underserved and already deeply rooted in the lifestyles of users.

Just how sweet is the pie?
- The combined worth of the airline ($750 billion), hotel ($550 billion) and video game entertainment ($91.95 billion) industries is roughly $1.4 trillion. All of these industries operate internationally, deal with the inefficiencies of cross-border transactions, and have customers around the globe.
- All of these industries have unmet banking needs. The airline industry, for example, loses 2% of its revenue every year to credit card processing fees and requires manual effort to move money for 4% of its earnings.
- These industries are a rich source of e-KYC (know your customer) data. Airlines alone served 2.8 billion passengers in 2011. Every person who checks into a hotel has to supply ID. The exact documents required for opening a bank account are already verified and maintained individually by these industries.

What will it take to grab a piece?
A partner bank that can reduce the operational costs of these industries stands to gain billions of new customers. No need to invest in new infrastructure or to bet on unproven technologies. Our solution is old-school, low-tech, financially regulated and immediately implementable.
- We are working on wrinq, a platform that can help banks gain billions of new customers by building cross-collaborative banking solutions.
- We propose truly international bank accounts modelled after the best features of digital wallets: easy user onboarding, the ability to hold money in multiple currencies, and fast, unrestricted exchange of funds within the internal banking network. A network as big as Facebook.
- The account holder will be charged a fixed account maintenance fee and will not be billed for every transaction made. Predictable fees for users; huge profits for banks due to the high volume of customers.

The side effects
“A business that makes nothing but money is a poor business.”
The effect we’ll have on the industry can’t be measured in monetary returns alone. What we’re doing is completely new and has the potential to change the way people look at international banking and international banking services. With the implementation of our solution there is:
- An opportunity for cross-industry collaboration through banking apps. A scenario where every customer who books with, say, Virgin Airlines, gets a bank account with our banking partner.
- An opportunity for banks to make their services more compatible with people's lifestyles. A scenario where in-game transactions take place through the internal banking network of our partner bank.
- An opportunity for banks to change the way people trade internationally by nullifying currency exchange losses with the help of multi-currency accounts.

Acquiring and retaining international customers aka The Plan
“Over the past years, tourism has proven to be a surprisingly strong and resilient economic activity and a fundamental contributor to the economic recovery by generating billions of dollars in exports and creating millions of jobs”
According to a UNWTO report, a staggering 1.1 billion tourists travelled abroad in 2014. That’s about 3 million tourists a day, or roughly 250 fully loaded jumbo jets taking off every hour, full of adventurers eager to explore the world.
Such is the effect of tourism that we’ve optimized air transport, hotel stays and advertising campaigns around it. There are special travel packages catering to tourists' needs, shops that sell exotic products, food that tastes different from anything they've tried before, and clothes unlike anything they’ve seen before.
Every industry is capitalizing on this explosion of global tourism with the noticeable exception of banking.

The poor foreign traveller
While there are places to go, clothes to buy and foods to try, the buying experience of an international visitor is surprisingly substandard. The prohibitively high cost of foreign transactions significantly reduces an individual's spending capacity. There’s an unfortunate state of mind where a person wants to spend but is apprehensive of the high charges that may be levied on each purchase. Membership fees, foreign exchange fees, card reload fees: basically any fee you can make up gets levied on the poor visitor.

The dazzling banking service
Away from home, the traveller has to deal with all sorts of inexplicable expenses. We can make life easier for them by:

- Issuing special prepaid payment instruments (PPIs) to users. Quick KYC, quick disbursal and no foreign exchange fees. All transactions will be local and there'll be a fixed usage fee. Every member of a family can have their own PPI.
- Collaborating with key players in the tourism industry chain to capture customers at various points in their journey. A PPI may be issued when a user lands at the airport or checks into a hotel.
- Managing PPIs with an easy-to-use mobile application. The user can request a card, reload a card and view all transactions made with the card, with an option to buy discounted services from our channel partners.
- Exchanging existing international travel cards for PPIs, getting rid of cash advances by converting a credit-only card into a debit-only PPI.

Can this get any better?
A successful execution of our plan will:
- Bring more revenue to banks, as the income generated through a PPI is decoupled from individual transactions. The bank charges the card holder upfront and thereby gets rid of all the unnecessary hassle. No reload fee means the PPI holder will reload more frequently.
- Bring recurring revenue in the form of cross-promotional partnerships. We’ll issue themed PPIs to increase the brand recognition of our partners. For example, we can charge Virgin Airlines a small fee to create a Virgin-themed PPI and display Virgin-specific content in the PPI management app.
- Open up international markets for local banks. Banks get the opportunity to capture customers that could otherwise never be acquired. People from all over the world who’d never open an account in a foreign country would gladly purchase a PPI that reduces their transaction fees.
- Allow local PPIs to be upgraded to multi-currency international cards (after submission of extended KYC docs), so the same cards can be reused by travellers when they visit somewhere else.

A global tourist bank need not be just a concept. The foundation required to build such an entity is already present. All we need is a bit of goal realignment to realize the immense benefits of a trillion-dollar, recession-proof and truly global industry.

originally posted by: 

Akshat Jiwan Sharma: akshat@wrinq.com

Mix IDE for smart contracts

Remember to put the following in the JS web console because this IDE sucks (it sets the default transaction-sending account to the first available account):

web3.eth.defaultAccount = web3.eth.accounts[0]

introduction to digital currency

Many abbreviations/assumptions appear in the explanations that follow, as these are merely intended as quick notes on building up a digital currency system. Any comments/suggestions appreciated. Submit to @arthurcolle on FB/TWTR/GOOGmail

Imagine cash as a text file. For 1. below, imagine that I send you 1token.txt which just has the contents "[1token]"

1. System - dollars as text files 

file contents - [1token] 

(problem: copying file means you now have more cash. No way of identifying who is the intended recipient, which may be a desired feature)

2. System - dollars as text files with sender/recipient 

file contents - [arthur sends barbie 1token] 

(problem: interpretation of duplicates - same as 1)

3. System - dollars as text files with timestamp 

file contents - [arthur sends barbie 1token @ 12:01am] 

(problem: interpretation of duplicates - same as 1)

4. System - dollars as text files with timestamp and random id 

file contents - [arthur sends barbie 1token @ 12:01am ~ xIijUhJKas] 

(problem: interpretation of duplicates - same as 1, but also, how to verify current balance of sender?)

Need way of authenticating the sender's balance!
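To make the duplicate and balance problems concrete, here's a toy ledger sketch in Python. It assumes a single trusted party maintains the ledger and the balances (exactly the centralization the network-as-central-bank idea below removes); `apply_tx`, the sample balances, and the return strings are all made up for illustration:

```python
# Toy central ledger for "System 4": the random id lets us reject duplicate
# broadcasts of the same token file, and the balances table answers the
# "does the sender actually have the money?" question.
seen_ids = set()
balances = {"arthur": 5, "barbie": 0}

def apply_tx(sender, recipient, amount, tx_id):
    if tx_id in seen_ids:
        return "duplicate"            # the same file was broadcast twice
    if balances.get(sender, 0) < amount:
        return "insufficient funds"   # the balance check systems 1-4 lack
    seen_ids.add(tx_id)
    balances[sender] -= amount
    balances[recipient] = balances.get(recipient, 0) + amount
    return "ok"

print(apply_tx("arthur", "barbie", 1, "xIijUhJKas"))  # ok
print(apply_tx("arthur", "barbie", 1, "xIijUhJKas"))  # duplicate
```

The catch, of course, is that whoever holds `balances` is a central bank. The next section replaces that trusted party with the network itself.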

Network as central bank - all participants help in the verification process. Transactions are broadcast to participants and bundled into "transaction blocks", whereby the individual constituent transactions are verified (does the sender have enough tokens? is the recipient a valid address? ...). Instead of a verification process where some fixed percentage of network participants (just computers running client-server software that relays data/transaction info to others in the network) - say 75% - must attest "yup, this tx is legit", you randomize the verification process in order to avoid having someone flood the network with fraudulent verifications.

This randomization process is called Proof of Work, in which an easily verifiable (yet ultimately somewhat arbitrary) computation is undertaken by participants. The first participant to solve the computation and broadcast the correct solution is rewarded a fixed quantity of currency, which gives us a way to issue new currency in the proposed system.

The proof of work algorithm is roughly described below:

The currency protocol has a predefined number called the difficulty (let's say the difficulty is 7): the minimum number of leading zeroes a valid solution's hash must have.

You take the bundle of transactions (just a binary blob of zeroes and ones, after all) and then set a variable x = 0.

You solve for a value of x (this x is what bitcoin calls the nonce) such that the concatenation (x || transaction-bundle-blob) has an output hash* (another blob of 0s and 1s) with at least 7 leading zeroes. After some period of time the difficulty is increased, making valid blocks harder to find; this keeps the rate of currency issuance in check as more computing power joins the network, and, combined with a declining block reward, lets the system approach a fixed supply of currency that maintains the tokens' value in the face of inflation.

(*) output hash: in the bitcoin protocol, this is actually the double-sha256 hash of (x || transaction-bundle-blob), i.e. sha256hash(sha256hash(x || transaction-bundle-blob))
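The search described above can be sketched in Python. This is a simplified illustration of the idea rather than bitcoin's actual block format; the 8-byte nonce encoding, the sample transaction blob, and the toy difficulty are arbitrary choices:

```python
import hashlib

def double_sha256(data: bytes) -> bytes:
    """bitcoin-style output hash: SHA-256 applied twice."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def leading_zero_bits(digest: bytes) -> int:
    """Count the leading zero bits of a hash digest."""
    bits = bin(int.from_bytes(digest, "big"))[2:].zfill(len(digest) * 8)
    return len(bits) - len(bits.lstrip("0"))

def mine(tx_blob: bytes, difficulty: int) -> int:
    """Solve for an x such that double_sha256(x || tx_blob)
    has at least `difficulty` leading zero bits."""
    x = 0
    while leading_zero_bits(double_sha256(x.to_bytes(8, "big") + tx_blob)) < difficulty:
        x += 1
    return x

blob = b"arthur sends barbie 1token @ 12:01am ~ xIijUhJKas"
x = mine(blob, 7)
# Anyone can cheaply verify the solution with a single hash:
print(leading_zero_bits(double_sha256(x.to_bytes(8, "big") + blob)) >= 7)  # True
```

Raising the difficulty by one bit roughly doubles the expected number of hashes needed, which is how the network throttles issuance while keeping verification a single cheap hash.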

Benchmarks, pt. 1


stat          RAILS      PHOENIX    improvement
avg latency   975.57ms   244.41ms   3.99x
avg req/s     1.36r/s    3.827r/s   2.81x
max latency   3.97s      1.05s      3.78x
max req/s     5.66r/s    6.33r/s    1.12x
total reqs*   2401.66    6444.66    2.68x

The stats above were computed based on 3 runs.
(*) means that these were averaged (took each number per run and
averaged them – for max requests per second this isn’t very meaningful).

They were not run concurrently (phoenix and rails at the same time)
and I gave like 10 seconds of breathing time between each run.

Phoenix had ~4x lower average latency, ~3x more requests per second on average, ~4x lower average maximum latency, approximately the same maximum requests per second (a fairly meaningless number), and 2.68x more total requests over the course of the runs.
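For transparency, the table's averages and improvement factors can be reproduced from the per-run numbers in the raw wrk output below; a quick sketch of the arithmetic:

```python
# Per-run numbers transcribed from the wrk output in this post.
rails = {
    "avg_latency_ms": [386.79, 1230.0, 1310.0],  # 386.79ms, 1.23s, 1.31s
    "req_per_sec":    [1.56, 1.24, 1.28],        # per-thread Req/Sec averages
    "total_reqs":     [2262, 2360, 2583],
}
phoenix = {
    "avg_latency_ms": [291.59, 225.57, 216.06],
    "req_per_sec":    [3.22, 4.06, 4.20],
    "total_reqs":     [5601, 6760, 6973],
}

def mean(xs):
    return sum(xs) / len(xs)

for stat in ("avg_latency_ms", "req_per_sec", "total_reqs"):
    r, p = mean(rails[stat]), mean(phoenix[stat])
    # latency improves when it goes down; throughput improves when it goes up
    factor = r / p if stat == "avg_latency_ms" else p / r
    print(f"{stat}: rails={r:.2f} phoenix={p:.2f} improvement={factor:.2f}x")
```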

Phoenix – killin’ the game.

Below are the numbers. If you find any error in interpretation/understanding/arithmetic,
please reach out to me.

I’d greatly appreciate any input! 


➜  ~  wrk -t25 -c25 -d60s "https://trophus.herokuapp.com"
Running 1m test @ https://trophus.herokuapp.com
  25 threads and 25 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   386.79ms  180.51ms 848.84ms   76.00%
    Req/Sec     1.56      0.51     2.00     56.00%
  2262 requests in 1.00m, 29.99MB read
  Socket errors: connect 0, read 0, write 0, timeout 183
Requests/sec:     37.65
Transfer/sec:    511.14KB
➜  ~  wrk -t25 -c25 -d60s "https://trophus.herokuapp.com"
Running 1m test @ https://trophus.herokuapp.com
  25 threads and 25 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.23s   792.82ms   3.13s    42.59%
    Req/Sec     1.24      1.37     9.00     78.92%
  2360 requests in 1.00m, 31.36MB read
  Socket errors: connect 0, read 0, write 0, timeout 8
Requests/sec:     39.28
Transfer/sec:    534.39KB
➜  ~  wrk -t25 -c25 -d60s "https://trophus.herokuapp.com"
Running 1m test @ https://trophus.herokuapp.com
  25 threads and 25 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.31s     1.50s    7.92s    86.54%
    Req/Sec     1.28      1.13     6.00     59.77%
  2583 requests in 1.00m, 34.27MB read
  Socket errors: connect 0, read 0, write 0, timeout 20
Requests/sec:     42.99
Transfer/sec:    584.08KB


➜  ~  wrk -t25 -c25 -d60s "https://www.trophus.com"
Running 1m test @ https://www.trophus.com
  25 threads and 25 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   291.59ms  176.61ms   2.17s    88.73%
    Req/Sec     3.22      1.12     5.00     68.33%
  5601 requests in 1.00m, 30.52MB read
Requests/sec:     93.24
Transfer/sec:    520.27KB
➜  ~  wrk -t25 -c25 -d60s "https://www.trophus.com"
Running 1m test @ https://www.trophus.com
  25 threads and 25 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   225.57ms   54.49ms 526.81ms   86.79%
    Req/Sec     4.06      0.80     6.00     59.06%
  6760 requests in 1.00m, 36.84MB read
Requests/sec:    112.55
Transfer/sec:    628.01KB
➜  ~  wrk -t25 -c25 -d60s "https://www.trophus.com"
Running 1m test @ https://www.trophus.com
  25 threads and 25 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   216.06ms   32.34ms 454.83ms   74.22%
    Req/Sec     4.20      1.18     8.00     64.00%
  6973 requests in 1.00m, 38.00MB read
Requests/sec:    116.06
Transfer/sec:    647.62KB

Connecting nodes in Elixir

This post only covers connecting nodes on my local network, but the same principles apply to remote connections; just make sure to use the externally facing IP address. All communications are unencrypted by default; in a future post I may cover how to encrypt them.

The two computers used were: a MacBook Pro (Early 2011) running 10.10.3 with a Core i5 dual core processor and an Acer Aspire V3 running Windows 8.1 with a Core i7. Both have 8 GB of memory.

The Windows computer has Erlang/OTP 17.4, the Mac has Erlang/OTP 18, and both have master-branch builds of Elixir as of today.


Open up iTerm on the Mac (or Terminal if you're a fake programmer), and cmd on Windows (yeah yeah, I have Cygwin, but I like to make my life more difficult).

Run iex on both.

Find your IP address (you can use the private IP address on your local network)

On the Mac I typed the following:

➜  ~  iex --name mac@ --cookie monsta
Interactive Elixir (1.1.0-dev) - press Ctrl+C to exit (type h() ENTER for help)

iex(mac@> :global.register_name :mac, :erlang.group_leader

On the Acer I typed the following:

C:\Users\arthur>iex --name win@ --cookie monsta
Interactive Elixir (1.1.0-dev) - press Ctrl+C to exit (type h() ENTER for help)

iex(win@> :global.register_name :win, :erlang.group_leader

Both should output :yes (an atom, if you're unfamiliar with Elixir; in Erlang it would be written as bare yes. Elixir has Ruby-like syntax with some important differences, and atoms are roughly Elixir's equivalent of Ruby symbols).

Then, you need to connect both running iex processes (they are Nodes) together.

On the Mac I typed the following:

iex(mac@> Node.connect :"win@"
==> true

and on the Acer I typed the following:

iex(win@> Node.connect :"mac@"
==> true

Both of these commands on their respective shells should return true. If one or the other fails, it is probably because the destination Node you're trying to connect to was started (remember the iex --name x@y command?) with a local IP rather than the external IP. Oddly, I hit this problem while writing this quick tutorial even though both machines are on the same network.

Last step before the fun!

You have to grab the process identifier of the other Node.

On each machine, look up the pid registered by the other (the Mac looks up :win, the Acer looks up :mac):

iex(mac@> win = :global.whereis_name :win
==> #PID<8931.9.0>
iex(win@> mac = :global.whereis_name :mac
==> #PID<9131.9.0>

So you could do some pretty magical stuff at this point, but just as a tease, try this:

mac |> IO.puts "Hey Mac, from Windows"
win |> IO.puts "Hey Win, from Mac"

This should cause your strings to print out in the other machine’s iex session! Cool!!!

This also worked on a DigitalOcean instance I spun up from both of my machines. Awesome stuff.

Feel free to shoot me any questions if you have any trouble. I’m @arthurcolle on the twitters and you can find me on freenode on #elixir-lang and #erlang.

Thanks for reading!