Summing up AI 2023

So the last 12 months have been amazing, if rather dramatic, with regard to AI. Things have improved a lot, and I'm sure we will see and hear more of this over 2024.

These are some of the things that I’ve found interesting…

We have of course had the big drama over at OpenAI, with Sam Altman being fired (or quitting?), then the board being replaced and Sam getting his job back. Rolling Stone has an interesting write-up about this.

Everyone is wondering and predicting what this Q* (pronounced "Q star") project at OpenAI is – some think it may be an AGI (Artificial General Intelligence), but very few people have had access to it so far, although there is a lot of speculation.

We still don't know what's happened with Google's LaMDA, and Blake Lemoine is still, I think, the canary in the coal mine with regard to this technology.

Building your own "bad version of ChatGPT" is a very real possibility. We should beware of Bad Robots! And I mean Bad Robots – you could theoretically hook your own bent version of ChatGPT up to a mechanical device and let it loose (who knows what the military are up to with this idea).

In addition to this there is the issue of copyright, and the fact that everyone is ignoring the importance of related links and knowledge that back up the statements made by AI (not to mention the issue of AI hallucinations). Although it is possible for these platforms to supply and reference sources, most of the commercial products don't include this functionality. This, I think, is going to pan out in interesting ways. Already a number of authors are attempting to sue OpenAI.

But I think the most interesting thing you can do is build your own ChatGPT and train it on your own data. I've set up something on an old machine I run myself and gave it a number of my old blog articles and various other bits and bobs to play with. The results were solid and interesting (with references!).
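If you're wondering what that looks like under the hood, here's a rough sketch of the retrieval side of the idea – plain TF-IDF matching rather than an actual language model, with a made-up folder of blog posts and an example question (Python, with scikit-learn assumed to be installed):

```python
from pathlib import Path
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Load some old blog posts (plain-text files in a local "blog_posts" folder).
docs = {p.name: p.read_text(encoding="utf-8") for p in Path("blog_posts").glob("*.txt")}
names, texts = list(docs), list(docs.values())

# Build a TF-IDF index over the documents.
vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform(texts)

def ask(question, top_k=3):
    """Return the top_k most relevant documents for a question, with scores."""
    scores = cosine_similarity(vectorizer.transform([question]), matrix)[0]
    return sorted(zip(names, scores), key=lambda pair: pair[1], reverse=True)[:top_k]

# The matching documents (and their file names) are what you would hand to a
# local language model as context -- and they are also where the "references" come from.
for name, score in ask("what did I write about backups?"):
    print(f"{score:.3f}  {name}")
```

A real setup swaps the TF-IDF step for proper embeddings and feeds the top chunks to a local model, but the source-tracking idea is the same.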

But two things come to mind with regard to this. First, you need CPU and RAM, and ideally a few GPUs, to run this sort of software – in short, a grunty machine or an expensive virtual machine that runs at a reasonable speed (I did manage to get it running on a machine with 4 cores and 8 GB of RAM, but it was very slow). The second is the scenario that follows.

If you have your own company and fast access to your own data (files, emails, databases, financial data etc.), and can hook it up to your machine, you could probably gain all sorts of interesting insights. What was the most profitable project? How many emails were sent? What were the time frames for this project? These and a whole lot more questions could be asked of your data. The stinger comes, though, when you get around to the speed of the computer running this and the connectivity of your expensive AI brain to the content.

If you only have a 100 megabit link to all that data sitting in the cloud, it's going to slow things down. If you have invested in local hardware (and, say, have 10 gigabit or more connectivity) to your data, you're going to get results much, much more quickly. I see an argument for employing your own sysadmin percolating!
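To put some rough numbers on that (the 500 GB corpus below is just an illustrative figure, not anything from my setup), the back-of-the-envelope arithmetic looks like this:

```python
# How long it takes to pull a given amount of data over a 100 Mbit/s link
# versus a 10 Gbit/s one.
def transfer_hours(gigabytes, megabits_per_second):
    megabits = gigabytes * 8 * 1000          # GB -> megabits (decimal units)
    return megabits / megabits_per_second / 3600

corpus_gb = 500                              # illustrative corpus size
for link_mbps in (100, 10_000):              # 100 Mbit/s vs 10 Gbit/s
    print(f"{link_mbps:>6} Mbit/s: {transfer_hours(corpus_gb, link_mbps):.1f} hours")
# Roughly 11 hours at 100 Mbit/s versus about 7 minutes at 10 Gbit/s.
```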

In short, 2024 is going to be just as crazy as 2023 – it's sort of amazing to be alive and witnessing all this. Thanks for reading, and stay safe over the holiday season!

Steve

 

Related links quoted! _____________________________

Blake Lemoine
https://www.newsweek.com/google-ai-blake-lemoine-bing-chatbot-sentient-1783340

WTF Is Happening at OpenAI?
https://www.rollingstone.com/culture/culture-news/sam-altman-fired-open-ai-timeline-1234889031/

This new AI is powerful and uncensored… Let’s run it
https://www.youtube.com/watch?v=GyllRd2E6fg

Authors sue OpenAI over ChatGPT copyright: could they win?
https://www.businessthink.unsw.edu.au/articles/authors-sue-openai-chatgpt-copyright

Let’s build GPT: from scratch, in code, spelled out.
https://www.youtube.com/watch?v=kCc8FmEb1nY

 

AI for work and home

As AI is starting to become something many of us are using, it's interesting to think about the possibilities of getting an AI to look at our own data. I've set up privateGPT on an older computer and given it data to "ingest"!

It does take some time, and it needs a decent amount of RAM and CPU grunt, but it works surprisingly well even on under-powered machines. It also gives you links to the source documents you had it ingest (unlike some products I might mention… cough!).
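For the curious, the "ingest" step boils down to something like the sketch below: split your local documents into overlapping chunks and remember which file each chunk came from, so answers can point back to their sources. The folder name and chunk sizes are illustrative rather than privateGPT's actual defaults.

```python
from pathlib import Path

def chunk(text, size=800, overlap=100):
    """Yield fixed-size, slightly overlapping character chunks of a document."""
    step = size - overlap
    for start in range(0, max(len(text) - overlap, 1), step):
        yield text[start:start + size]

# Walk a local folder of plain-text documents and chunk each one,
# keeping the source file name alongside every chunk.
corpus = []
for path in Path("my_documents").rglob("*.txt"):
    for piece in chunk(path.read_text(encoding="utf-8", errors="ignore")):
        corpus.append((path.name, piece))

print(f"Ingested {len(corpus)} chunks from {len(set(name for name, _ in corpus))} files")
# Each chunk would then be embedded and stored in a vector index; because the
# source file name travels with the chunk, answers can cite where they came from.
```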

There is a part of me that thinks this sort of product could be very useful for insight within a private company (or even for an individual). Think about the possibilities of giving it access to all emails sent. What could it learn? All the files on the server – who created the most files? Who was involved in which projects, and what were the skill sets of the people involved (past and present)?

Let's also think about the possibility of an AI having access to production data and financials!? What could be gleaned from all that information? It's been said that we will soon have personal versions of AI that can run on phones. I'm sort of looking forward to this… but having recently re-watched the most excellent Ridley Scott movie Alien: Covenant, I'm recommending that we proceed with caution!

 

AI secret source!

A couple of the publicly accessible AIs that I've been tinkering with will not quote sources, and will not tell you much, if anything, about what they have been trained on. But there is now a very interesting development that the Guardian have reported on: Sarah Silverman is currently suing OpenAI and Meta, claiming AI training infringed copyright.

https://www.theguardian.com/technology/2023/jul/10/sarah-silverman-sues-openai-meta-copyright-infringement

I'm going to be holding on to my popcorn real tight as this works its way through the courts, and the bigwigs in Silicon Valley work out what to do and how to do it. I told you 2023 was going to be a very interesting year for AI.

AI and 2023

So I think 2023 is going to be a very interesting year as far as AI is concerned. In one corner you have OpenAI with ChatGPT and a 10-billion-dollar investment from Microsoft. In the other corner you have Google with LaMDA. Which will prevail is going to be very interesting, although at the present time it looks as though Microsoft, if they don't mess it up, may be able to overtake Google in the search field (they are close to integrating their products with ChatGPT, and they also own a big chunk of the OpenAI company).

Google, on the other hand, may have a different problem, in that LaMDA may be too advanced already – they may have created something that is difficult to monetise.

I've been thinking more about the emergence of sentience and consciousness. There has been a lot of argument about whether, and how, a sentient being could be developed or evolved in a computational environment. I've touched on the concept of the Chinese room in the past. But there are two things I find interesting.

The first is that AI is based on structures similar to the human brain, and it can learn (although as far as we know it can only do repetitive things well – we are not sure about the process of it evolving… yet!).

The second is the process of learning, or perhaps the process of growing up as a child and becoming a human – in some ways we ourselves start in, and then get past, the state of the "Chinese room".

Think of a child and how it learns. It all starts with people and basic communication – words like NO, MUMMA, DADDA, etc. In the beginning the child does not understand, and in fact just answers in a manner that it thinks may be correct – it starts with noise and imitation, but in time (and with feedback) understanding grows.

Do you remember when you became aware of yourself? Do you remember when you discovered what feelings were? Computational feedback loops are not uncommon. Could they evolve into structures resembling human emotion, or something similar?