Given how similar the potential effects of AI are to the changes brought about by the Industrial Revolution, I'm surprised more people aren't taking the chance to really think about it. The economic effects of most technological improvements follow a similar pattern, but the Industrial Revolution was a particularly big shift, as AI may be, making it a particularly attractive point of comparison.
Technologically, the story runs something like this: A lot of slow, somewhat skilled work got replaced by machines working quickly and some less skilled labour. Economically, productivity went way up, and overall people got more stuff cheaper, and it was all to the good. Before, most of the money went to the labourers. Afterwards, well, machines are expensive and unskilled labour is cheap, so the money mostly went to the owners of the machines. There was a shift where the returns to labour decreased and returns to capital increased. This may have caused some social problems for a while, but, y'know, all's well in the end, right?
The unspoken assumption about AI is that things should pan out in the same way. There's talk of "there will be winners and losers, but overall AI will benefit humankind", but no-one's so gauche as to be explicit that we're just assuming the winners will be big tech and the losers will be anyone creative. If we say that, someone might question why that has to be the case, rather than just assuming that things will play out like the Industrial Revolution.
There are a bunch of reasons why it shouldn't play out like that, and why we should choose not to let it. For a start, the Industrial Revolution was not planned; we did not have the benefit of hindsight, of having done it before. This time we know what that history looks like, and we can choose to do something different. Choosing not to intervene is not the natural order either: it is very much a choice.
So, why is this different from the Industrial Revolution? The before state is returns going to labour: people who write text and draw pictures get paid for their work. The assumed outcome is that in the future we'll be giving a smaller amount of money to big tech, and no money to creators. However, this is not economically driven; it is driven by regulatory structure and an assumption that history should repeat.
Returns to capital are usually driven by the fact that deploying capital is expensive. Machines are expensive. Companies that can spare the money to invest or have an intellectual property advantage use that to get returns on their capital. Except... AI is a mess commercially right now because no-one has a moat. No-one has built a huge advantage from their model. There are good open source models. Even training infrastructure isn't the huge advantage people had hoped, as people have found ways to make smaller models that are almost as effective, and adapt expensively-trained models cheaply. People wanted the value of AI to be in the technology and infrastructure, to enable returns to capital. It turns out that's not where the value is.
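To make the "adapt expensively-trained models cheaply" point concrete, here's a minimal sketch of low-rank adaptation (the "LoRA" family of techniques) in PyTorch. This is a toy version under assumed dimensions, not any particular library's implementation: the expensive pretrained weights are frozen, and only a tiny adapter is trained.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Wrap a frozen pretrained linear layer with a small trainable
    low-rank update: y = Wx + (B A)x, where A and B are tiny."""
    def __init__(self, pretrained: nn.Linear, rank: int = 8):
        super().__init__()
        self.pretrained = pretrained
        for p in self.pretrained.parameters():
            p.requires_grad = False  # the expensive weights stay frozen
        in_f, out_f = pretrained.in_features, pretrained.out_features
        self.lora_a = nn.Parameter(torch.randn(rank, in_f) * 0.01)
        self.lora_b = nn.Parameter(torch.zeros(out_f, rank))

    def forward(self, x):
        # Original output plus the cheap low-rank correction.
        return self.pretrained(x) + x @ self.lora_a.T @ self.lora_b.T

# A 4096x4096 layer carries ~16.8M frozen weights; the rank-8 adapter
# adds only 2 * 8 * 4096 = 65,536 trainable parameters (~0.4%).
layer = LoRALinear(nn.Linear(4096, 4096), rank=8)
trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
print(f"trainable parameters: {trainable}")
```

If a few hours on a single GPU can specialise a model that cost millions to train, the capital investment isn't much of a moat.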
The value is in the training data: The creative output of real humans. The bit that's currently being valued at nothing in an AI world. The value of human labour is usually determined by supply and demand. In the industrial revolution, returns to labour went down because a smaller amount of less skilled work was required to produce the given output. In the AI scenario the price of the input data is zero not because that's the market price of producing that data, but because we've currently got a regulatory framework that just allows people to take it.
The business model of AI right now is largely attribution laundering. Searching the web used to get me a link to content made by people: they got their attribution, and they found some way to monetise providing that information. The move since has been to have search engines try to answer questions directly, grudgingly providing a link back to the original source and hoping you won't click through. The AI model is to put all the data into a big pot, stir it around, serve up an answer, and give zero credit to the people who contributed to it. The answer still comes from the source data, but all attribution, all sense of owing anything, has been wiped away.
This is both ethically and economically broken. Anyone who loves creating and sharing art is surely not in favour of building a system that destroys people's ability to make money from that art and otherwise disincentivises its production. Taking such art, freely given, and making it into a weapon against the gift-giver cannot have an ethical basis.
In some cases it might be argued that the T&Cs of services enabled this use as training data. This seems about as sound as trading Manhattan Island for a pile of beads, but it's clearly not the limit of what big tech desires. In this article Sam Altman claims that "material on the public web would be fair game". Interestingly, the true value of this data is clear to him when he says "companies should have the right to say they do not want their data used for AI training" - the goal is to structure winners and losers on a regulatory basis, rather than by optimising for ethical and economic goals.
This brings me on to the economic side of things. We build economic structures that incentivise the behaviour we want. For example, free markets might look like a self-organising structure, but they're built on property rights. Historically, we have wanted to encourage human creativity, and we ended up with copyright law. The current AI regime tries to end-run copyright. The only reason to give up on protections in the face of a much bigger threat to IP is if we no longer want to encourage human creativity.
AI depends on its training data. If we stop contributing human creativity, and populate the world with AI generated art and text, asymptotically we'll have AI-generated media based on AI-generated media. What that will look like is not clear, but it looks like a recipe for destroying originality, a key driver of change and growth.
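To make that feedback loop concrete, here's a deliberately crude toy model (a sketch of the idea, using the spread of a distribution as a stand-in for originality, not a claim about any real system): fit a Gaussian to samples from the previous generation's Gaussian, over and over, and watch the diversity decay.

```python
import random
import statistics

# Generation 0: human-made media, with spread (our stand-in for
# originality) normalised to 1.0.
random.seed(42)
mu, sigma = 0.0, 1.0
for gen in range(1, 101):
    # Each generation of "AI" is trained only on the previous
    # generation's output, here a small sample from its distribution.
    samples = [random.gauss(mu, sigma) for _ in range(10)]
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)  # refitting tends to lose spread
    if gen % 20 == 0:
        print(f"generation {gen:3d}: spread = {sigma:.4f}")
# The spread drifts downward: each model sees less variety than the
# one before, until the output is media about media, all alike.
```

The exact numbers are irrelevant; the point is the direction of travel once human input leaves the loop.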
Who knows, maybe I'm wrong; maybe human-based creativity is overrated, and not that valuable. From an economic standpoint, though, there should be a price, it should be discoverable, and anyone willing to work for that little can feel free to. Forcing the value of human creativity to zero makes no sense for the free-market invisible-hand-ers, nor for the "build an economic system to enable the outcomes you want" crew. The only people it works for are those who want to take others' work for free, label it as their own, and sell it.
To summarise: the value in AI comes not from the models or the infrastructure but from the human creative work it is trained on. That value is currently being priced at zero by regulatory choice, not by market forces, and that is both ethically and economically broken. We've seen how the Industrial Revolution shifted returns from labour to capital; this time we can see it coming, and we can choose differently.
Posted 2023-05-25.