
Marcus Graichen: Looking forward with AI


Everyone is talking about AI, but what does the future hold? In a series of interviews we examine unexpected perspectives on the space.

Marcus Graichen, developer and product architect, believes that decentralized and open source AI tools could become seriously competitive with ChatGPT and other commercial large language models. Graichen educates a worldwide audience on the use of Bittensor, a decentralized AI network that incentivizes participants to train and operate more than 4,000 machine-learning models in a distributed manner.

Bittensor is like Bitcoin in many ways. It has a transferable, censorship-resistant token, TAO, which runs around the clock on a decentralized blockchain substrate that is auditable and transparent. Like Bitcoin, Bittensor is run by miners, who can operate globally and anonymously.

Bittensor Paradigm

Yes, it’s complicated. But if you remember BitTorrent and downloads (Graichen explains this later), you’ll have a sense of what’s going on here.

Grappling with Bittensor

Q: How did you get involved with Bittensor?

A: My background, going back 25 years, is in media, film and imaging. I taught at university for about five years, then moved into web development, working with a lot of companies across the spectrum. I’ve been doing that for about 20 years. I became interested in cryptocurrency around 2017; it piqued my interest from a technical point of view because it changed a lot of industries I had been involved in. Being a developer within the crypto-sphere didn’t interest me; it seemed a bit soulless. Then GPT started to grow to prominence and I realized it was an incredibly powerful tool.

Bittensor was put on my radar as an investment opportunity. I understood what they were doing and could see the flaws in GPT as a centralized language model, so I just kept following it. I now don’t consider Bittensor to be a crypto product. It has a token because, in order to decentralize incentivization, a blockchain is actually a perfect fit.

Q: And this led to you founding taostats?

A: Yes. It was born from keeping a lot of spreadsheets and passing them around within a small community. Anyone who uses Bittensor uses taostats [from the website: Bittensor metagraph statistics, data analysis, price history, staking, resources, rankings and information.] This gave me a prominent position as someone willing to communicate. I don’t come from a machine learning background and I’m not a traditional Python programmer, so I guess I had an accessible persona for explaining things to people.

Q: Then came BitAPAI?

A: Yes, and that was a terrible name because no one could say it. “BitaPAPI?” So we rebranded to Corcel. We are attempting to build products across the board and provide developers with access to them and build our own sub-nets; a bit of everything.

Q: Does anyone own Bittensor?

A: No. It’s a multi-layered protocol where the blockchain layer is solely the incentivization structure. It records the scores that validators assign to miners, on which incentivization is based. Let’s say I have a room full of 100 university professors providing my intelligence. I go in as a trusted validator with a set of questions by which to score them. The scores I give them are posted on the blockchain, transparent and open, and based on that they are then rewarded by the network.
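To make the validator analogy concrete, here is a minimal Python sketch of the idea: a validator asks every miner the same questions, scores their answers, and publishes normalized weights on which rewards are based. The miners, the scoring function and the reward split here are hypothetical illustrations, not the real Bittensor SDK.

```python
# Minimal sketch of the validator/miner scoring idea described above.
# Illustrative only: the miners, scoring rule and reward split are invented.

def score_response(question: str, answer: str) -> float:
    """Toy scoring rule: a real validator would compare answers against
    ground truth or against the consensus of other miners."""
    return 1.0 if answer.strip() else 0.0

def validate(miners: dict, questions: list) -> dict:
    """Ask every miner the same questions and accumulate a score per miner."""
    scores = {uid: 0.0 for uid in miners}
    for q in questions:
        for uid, miner in miners.items():
            scores[uid] += score_response(q, miner(q))
    total = sum(scores.values()) or 1.0
    # Normalized weights stand in for what a validator posts on-chain;
    # the network then pays miners in proportion to these weights.
    return {uid: s / total for uid, s in scores.items()}

if __name__ == "__main__":
    miners = {
        "miner-1": lambda q: "42",
        "miner-2": lambda q: "",          # an unresponsive miner earns nothing
        "miner-3": lambda q: "forty-two",
    }
    weights = validate(miners, ["What is 6 x 7?"])
    print(weights)  # {'miner-1': 0.5, 'miner-2': 0.0, 'miner-3': 0.5}
```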

Bittensor was created by the Opentensor Foundation, a non-profit. There was no fundraising. They wrote this protocol, put together this consensus, released a white paper and threw it out into the world. The Foundation are now trying to step back although they do still control the fundamental levers of the network. It is essentially open source technology.

Where generative AI comes into the picture

Q: Can you start to unpack the differences between something like ChatGPT and the genAI products associated with Bittensor?

A: Bittensor itself is not just a language model. There are two sub-nets at the moment that produce text responses, and they work slightly differently. One allows the miners to run any model they can get their hands on — open source, APIs going straight into ChatGPT or anything else commercially available. They could be running secret language models they have access to, anything they want. Miners are rewarded for correct yet diverse responses. The other text-based sub-net, Cortex.t (which is what Corcel’s chat runs on), focuses on speed and accuracy, and not on diversity but similarity — if you ask everyone a question and the majority give you the same answer, the chances are that that’s going to be the valid response.
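As a rough illustration of the similarity-based reward Graichen describes for Cortex.t, the short Python sketch below rewards miners whose answer matches the majority answer. The grouping and reward values are invented for illustration; the actual sub-net logic is more involved.

```python
# Sketch of a "similarity" reward: miners agreeing with the consensus
# answer score highest. Reward values here are invented for illustration.
from collections import Counter

def similarity_rewards(answers: dict) -> dict:
    """Reward each miner by whether it agrees with the consensus answer."""
    normalized = {uid: a.strip().lower() for uid, a in answers.items()}
    consensus, _ = Counter(normalized.values()).most_common(1)[0]
    return {uid: 1.0 if a == consensus else 0.0 for uid, a in normalized.items()}

print(similarity_rewards({
    "miner-1": "Paris",
    "miner-2": "paris",
    "miner-3": "Lyon",
}))  # {'miner-1': 1.0, 'miner-2': 1.0, 'miner-3': 0.0}
```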

The simple way to explain it is, rather than asking a single source to perform a task — return a response or return an image — you are asking a distributed network to all do the same thing. And depending on the reward model for that network, you decide who provides the best value; you are scoring and marking them.

Remember BitTorrent? Torrents were a turning point for downloads because when we wanted to download a movie or piece of software, rather than having to connect to a single file-sharing site that had one copy of it, hosted by a single person — and as it became popular it became very slow — that file was distributed. Everybody who had it could return it, and those who returned it more were valued higher by the network. BitTorrent wasn’t on a blockchain and there was no incentivization, but when I was young it was a cool, geeky thing to have a good ratio on BitTorrent or LimeWire, which meant you gave more than you received. A badge of honor. What Bittensor does is reward for performance.


Q: So rather than using one large language model, or two or three, you get access to many. And the best get rewarded.

A: Correct. But the difficult thing with language models is that it’s hard to say what is and isn’t a good response. If I ask you nine times nine, that’s very simple; but if I am rewarding a language model for diversity it might decide to be more astute and tell me a little bit about the history of multiplication. That might not be what you want.

Q: Tell us more about what you’re doing with Corcel.

A: The original idea was to provide API access to any of the sub-nets on Bittensor that would allow it. The original things we had were a text-prompting chat UI and an image studio app. These were meant to be examples of what you could do, but people took them as products. The Corcel chat app is as good as, if not better than, GPT for everything you need to do on a daily basis. We’re about to plug vision into this. You can show it a photo and say, “cut out the dog,” or “replace it with a cat,” or “look at the items in my fridge and give me a recipe.”
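For a sense of what “API access to a sub-net” could look like from a developer’s side, here is a hypothetical Python sketch. The endpoint URL, credential, payload shape and response format below are placeholders (loosely OpenAI-style), not Corcel’s actual API.

```python
# Hypothetical example of calling a text-prompting sub-net over HTTP.
# The endpoint, auth header and response shape are invented placeholders.
import requests

API_URL = "https://api.example.com/v1/chat"   # placeholder endpoint
API_KEY = "YOUR_API_KEY"                      # placeholder credential

def ask(prompt: str) -> str:
    """Send a chat prompt and return the text of the reply."""
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"messages": [{"role": "user", "content": prompt}]},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(ask("Suggest a recipe using eggs, spinach and feta."))
```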

The bottom line

Q: Why use Corcel chat over GPT?

A: When people ask me that, I say, would you pay $20 a month for GPT? They say yes. I say, you don’t have to pay for Corcel. It’s free. Currently, the network’s incentivization allows it to be free, and it will probably remain free for consumers because the commercialization of revenue-generating apps will fund it.

Q: So that’s the bottom line? It works better and it’s free?

A: I’m not sure we’re going to go with that as the selling point, but that’s the truth of it. I am not going to take away from the fact that GPT is the industry leader in what they’re doing. The truth is that so many of the good open source models are trained on GPT.



