THeBlueCashew

The Flame Pit => The Guest Nest => Topic started by: Biggie Smiles on September 28, 2025, 11:47:20 PM

Title: Be afraid. Be very afraid
Post by: Biggie Smiles on September 28, 2025, 11:47:20 PM
As a highly competent technologist I'm telling you all that AI will dominate humanity in an unsustainable fashion in less than 10 years.

if that.

The systems are improving at an exponential rate. For instance, approximately 2 years ago I asked Gemini and Copilot to write a simple Linux bash script to send echo requests to every IP address on a class C subnet. Technical jargon aside, this is a pretty simple operation that consists of maybe 8 lines of code if you're really looking to get fancy.
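
For reference, a bare-bones version of that task looks something like this (the 192.168.1 prefix is purely a placeholder, not any particular network):

#!/usr/bin/env bash
# Ping every host on a class C (/24) subnet once and report which ones answer.
# 192.168.1 is just an example prefix; swap in whatever network you're sweeping.
subnet="192.168.1"
for host in $(seq 1 254); do
    if ping -c 1 -W 1 "${subnet}.${host}" > /dev/null 2>&1; then
        echo "${subnet}.${host} is up"
    fi
done

Nothing exotic: loop over the last octet, fire one echo request at each address with a one-second timeout, and print the hosts that respond.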

AI barely completed the task.


Over the last 3 months I've written a program consisting of approximately 30,000 lines of code leveraging bash, PHP, JavaScript and basic HTML.

AI wrote a good 10,000 lines of that code which I then used to augment my own code in the form of functions

that is fucking scary.

We are getting to a point where AI can now write even better AI, and once we master the arena of quantum computing we're done. Not because AI will attack us like something out of Terminator, but because of our own inherent greed. In an effort to replace humans with AI we will create an unemployment catastrophe unlike anything we've ever seen. And that circumstance will create the cascading sequence of events that will end us as a species.

https://www.youtube.com/watch?v=UclrVWafRAI
Title: Re: Be afraid. Be very afraid
Post by: . on September 29, 2025, 01:57:44 AM
I remember watching this a couple of weeks ago. I agree, Biggie, it is our greed that will likely be our undoing, though it is perhaps not as certain as you might be thinking. For instance, as impressive as these intelligences' evolutionary curves assuredly are, they still reside on incredibly delicate systems in the physical world, stuff that an EMP or CME could cause serious problems for.

The sword of greed could prove double-edged if one nation takes it into its head to detonate a nuclear airburst over a rival nation's datacenters: any robotic agents controlled by the affected superintelligence would be bricked. On the other hand, no amount of "greed" is going to preserve a superintelligence and its artificial agents should the Earth find itself in the path of the afterbirth of a CME on the scale of the Carrington Event. There's only so much protection you can offer these systems, and that protection tends to be cumbersome and expensive to implement.

Still, having said this, it isn't really the sort of game I'd like to place a wager on myself. Too much like Russian roulette for my liking, and I'd recommend a crash course in self-sufficiency for anyone looking to sidestep it, as well as getting to a piece of turf you stand a good chance of defending against all comers, because if you've secured your future as best you can, there are going to be hordes that haven't and are all too ready to come take it from you.
Title: Re: Be afraid. Be very afraid
Post by: DKG on September 29, 2025, 07:22:05 AM
Quote from: Biggie Smiles on September 28, 2025, 11:47:20 PMAs a highly competent technologist I'm telling you all that AI will dominate humanity in an unsustainable fashion in less than 10 years.

if that.

The systems are improving at an exponential rate. For instance, approximately 2 years ago I asked Gemini and Copilot to write a simple Linux bash script to send echo requests to every IP address on a class C subnet. Technical jargon aside, this is a pretty simple operation that consists of maybe 8 lines of code if you're really looking to get fancy.

AI barely completed the task.


Over the last 3 months I've written a program consisting of approximately 30,000 lines of code leveraging bash, PHP, JavaScript and basic HTML.

AI wrote a good 10,000 lines of that code which I then used to augment my own code in the form of functions

that is fucking scary.

We are getting to a point where AI can now write even better AI, and once we master the arena of quantum computing we're done. Not because AI will attack us like something out of Terminator, but because of our own inherent greed. In an effort to replace humans with AI we will create an unemployment catastrophe unlike anything we've ever seen. And that circumstance will create the cascading sequence of events that will end us as a species.

https://www.youtube.com/watch?v=UclrVWafRAI
That is an eye-opening example. The speed of AI's advance is scary.

I know my career in financial services will fall victim to AI eventually. It makes me glad I am over fifty and I have the luxury of retiring in about five years.

I would watch that entire video in pieces, but it is probably depressing, so I will settle for your commentary.
Title: Re: Be afraid. Be very afraid
Post by: formosan on September 29, 2025, 08:46:53 AM
I knew this was coming but I didn't realize how soon jobs will be eliminated.

I worry about my children's futures......will they be able to afford to support children.....or even support themselves.

Title: Re: Be afraid. Be very afraid
Post by: Brent on September 29, 2025, 11:38:37 AM
Quote from: Biggie Smiles on September 28, 2025, 11:47:20 PMAs a highly competent technologist I'm telling you all that AI will dominate humanity in an unsustainable fashion in less than 10 years.

if that.

The systems are improving at an exponential rate. For instance, approximately 2 years ago I asked Gemini and Copilot to write a simple Linux bash script to send echo requests to every IP address on a class C subnet. Technical jargon aside, this is a pretty simple operation that consists of maybe 8 lines of code if you're really looking to get fancy.

AI barely completed the task.


Over the last 3 months I've written a program consisting of approximately 30,000 lines of code leveraging bash, PHP, JavaScript and basic HTML.

AI wrote a good 10,000 lines of that code which I then used to augment my own code in the form of functions

that is fucking scary.

We are getting to a point where AI can now write even better AI, and once we master the arena of quantum computing we're done. Not because AI will attack us like something out of Terminator, but because of our own inherent greed. In an effort to replace humans with AI we will create an unemployment catastrophe unlike anything we've ever seen. And that circumstance will create the cascading sequence of events that will end us as a species.

https://www.youtube.com/watch?v=UclrVWafRAI
I am watching this video right now and I will not stop until it's done. We are facing a dystopian future because of greedy progtards. The effects of this are going to be seen starting in 2027.


Title: Re: Be afraid. Be very afraid
Post by: Brent on September 29, 2025, 11:40:35 AM
Quote from: formosan on September 29, 2025, 08:46:53 AMI knew this was coming but I didn't realize how soon jobs will be eliminated.

I worry about my children's futures......will they be able to afford to support children.....or even support themselves.


I really feel for kids today. What a bleak future rich prog scum is leaving them.
Title: Re: Be afraid. Be very afraid
Post by: Brent on September 29, 2025, 11:44:02 AM
Charlie Kirk on the reality of AI.
https://www.youtube.com/shorts/sAMxRESvhYs
Title: Re: Be afraid. Be very afraid
Post by: Biggie Smiles on September 29, 2025, 12:10:55 PM
Quote from: . on September 29, 2025, 01:57:44 AMI remember watching this a couple of weeks ago. I agree, Biggie, it is our greed that will likely be our undoing, though it is perhaps not as certain as you might be thinking. For instance, as impressive as these intelligences' evolutionary curves assuredly are, they still reside on incredibly delicate systems in the physical world, stuff that an EMP or CME could cause serious problems for.

The sword of greed could prove double-edged if one nation takes it into its head to detonate a nuclear airburst over a rival nation's datacenters: any robotic agents controlled by the affected superintelligence would be bricked. On the other hand, no amount of "greed" is going to preserve a superintelligence and its artificial agents should the Earth find itself in the path of the afterbirth of a CME on the scale of the Carrington Event. There's only so much protection you can offer these systems, and that protection tends to be cumbersome and expensive to implement.

Still, having said this, it isn't really the sort of game I'd like to place a wager on myself. Too much like Russian roulette for my liking, and I'd recommend a crash course in self-sufficiency for anyone looking to sidestep it, as well as getting to a piece of turf you stand a good chance of defending against all comers, because if you've secured your future as best you can, there are going to be hordes that haven't and are all too ready to come take it from you.

One thing to keep in mind is the quantum leaps we are due to experience once an entity thousands of times better than us in all the areas that matter starts augmenting our efforts to create the next round of superintelligence.

For instance, major breakthroughs in this arena used to occur, what, once every 10 years? But what happens to that dynamic when a breakthrough causes a complete paradigm shift and the next breakthrough arrives in 10 minutes instead of 10 years? It's literally counting backwards in binary until you reach the singularity, a point which none of us can predict. But a superintelligence will be able to, and that is when our end will come.
Title: Re: Be afraid. Be very afraid
Post by: Biggie Smiles on September 29, 2025, 12:57:53 PM
think about this

Major AI development is being driven in leftist areas of the country. The people designing this next wave of technology have no belief in God, no morals, no standards and no alignment with basic decency.

We.Are.Fucked

https://www.youtube.com/watch?v=79-bApI3GIU
Title: Re: Be afraid. Be very afraid
Post by: Thiel on September 29, 2025, 01:22:12 PM
Not all AI is negative. The advances it will bring in surgery for example will benefit mankind.

There are extraordinary opportunities going forward but also some genuine risks.
Title: Re: Be afraid. Be very afraid
Post by: Biggie Smiles on September 29, 2025, 01:32:25 PM
Quote from: Thiel on September 29, 2025, 01:22:12 PMNot all AI is negative. The advances it will bring in surgery for example will benefit mankind.

There are extraordinary opportunities going forward but also some genuine risks.
you're talking about narrow AI. AI designed to augment the productivity of a human

the AI I am worried about is the AI designed to completely replace all humans

coupled with human greed this is a deadly outcome
Title: Re: Be afraid. Be very afraid
Post by: Thiel on September 29, 2025, 01:39:51 PM
Quote from: Biggie Smiles on September 29, 2025, 01:32:25 PMyou're talking about narrow AI. AI designed to augment the productivity of a human

the AI I am worried about is the AI designed to completely replace all humans

coupled with human greed this is a deadly outcome
We are on the same page. I am very concerned about it too.
Title: Re: Be afraid. Be very afraid
Post by: Brent on September 29, 2025, 02:47:08 PM
Bill Gates is a big proponent of AI. You know it is evil.
Title: Re: Be afraid. Be very afraid
Post by: Herman on September 29, 2025, 09:03:37 PM
I do not know shit about AI. Any technology is over my flat head.

But as I understand it, this AI shit changes humankind like nothing we have ever seen before. It makes people obsolete.

I am leaving everything to my boy and he said he will leave everything to my two grandkids. Inherited wealth will be the only way folks can survive.

Jesus H, prog money has no soul.
Title: Re: Be afraid. Be very afraid
Post by: DKG on October 01, 2025, 02:43:06 PM
Some school districts are experimenting with AI software that generates lesson plans, constructs writing assignments, and even helps teachers communicate with students. One platform called MagicSchool bills itself as the go-to AI assistant for educators worldwide, designed to simplify teaching tasks, save time, and combat teacher burnout.

This means decisions about how your child learns, what material they see, and even how their performance is evaluated are increasingly influenced by Big Tech algorithms. The question is: Who controls those algorithms, and what values are embedded into them? Parents deserve answers before handing their children's education to algorithms.
Title: Re: Be afraid. Be very afraid
Post by: Prof Emeritus at Fawk U on October 02, 2025, 11:50:54 AM
Quote from: DKG on October 01, 2025, 02:43:06 PMSome school districts are experimenting with AI software that generates lesson plans, constructs writing assignments, and even helps teachers communicate with students. One platform called MagicSchool bills itself as the go-to AI assistant for educators worldwide, designed to simplify teaching tasks, save time, and combat teacher burnout.

This means decisions about how your child learns, what material they see, and even how their performance is evaluated are increasingly influenced by Big Tech algorithms. The question is: Who controls those algorithms, and what values are embedded into them? Parents deserve answers before handing their children's education to algorithms.

Maybe an AI for home schooling can be developed. If it can counter the propaganda the public schools use, then it's a victory.
Title: Re: Be afraid. Be very afraid
Post by: Brent on October 02, 2025, 12:19:01 PM
Quote from: Prof Emeritus at Fawk U on October 02, 2025, 11:50:54 AMMaybe an AI for home schooling can be developed. If it can counter the propaganda the public schools use, then it's a victory.
I was wondering if AI would make brick and mortar schools obsolete.
Title: Re: Be afraid. Be very afraid
Post by: Prof Emeritus at Fawk U on October 02, 2025, 08:27:39 PM
Quote from: Brent on October 02, 2025, 12:19:01 PMI was wondering if AI would make brick and mortar schools obsolete.

I hope it does. No more BS being pushed. As long as the parents are aware of what the AI is doing, then I see it as a victory over propaganda.
Title: Re: Be afraid. Be very afraid
Post by: . on October 03, 2025, 07:21:28 AM
Quote from: Biggie Smiles on September 29, 2025, 12:10:55 PMOne thing to keep in mind is the quantum leaps we are due to experience once an entity thousands of times better than us in all the areas that matter starts augmenting our efforts to create the next round of superintelligence.

For instance, major breakthroughs in this arena used to occur, what, once every 10 years? But what happens to that dynamic when a breakthrough causes a complete paradigm shift and the next breakthrough arrives in 10 minutes instead of 10 years? It's literally counting backwards in binary until you reach the singularity, a point which none of us can predict. But a superintelligence will be able to, and that is when our end will come.
Our extinction is a very real possibility, I agree. Being able to adapt to changes that arrive ever more quickly only gets you so far, and you can already see the beginnings of panic in some of the members here as they wonder what kind of future will be reaped by their own children. Getting out ahead of the curve in the manner I described in my previous post does not commend itself to them. If we are to be brutally honest with each other here, even the most radical shifts towards self-sufficiency likely wouldn't help them, as reliant as we have become on other agents in society to do for us what we will not (read "cannot") do for ourselves.

Douglas Adams once commented on the plight of the Baiji, a species of dolphin indigenous to the Yangtze River since time immemorial. These dolphins are practically blind in their unquestionably polluted habitat, and as the Yangtze has become progressively more turbid, natural selection has seen the Baiji relying on echolocation in preference to sight, which served them admirably as a species... until more recent times, when the explosion of motorized craft on the Yangtze has effectively deafened them as well. This problem has developed a lot faster than they could evolve a solution; these days they are facing extinction because of it.

You might see this example as an analogue for what humanity is facing with the headlong rush towards creating superintelligences. Most members of this forum... their lineages will be curtailed rudely, and you and I may indeed be numbered among those. Then again, there exists the possibility of any superintelligence being snuffed out in the manner I described in my previous post; but that outcome shouldn't be relied upon to occur in time for the lion's share of humanity to survive, most of which is utterly dependent on the same systems that AI and superintelligences are being built on anyway.

We are, I feel, headed for a choke point in our evolution. Should we survive it as a species, it is highly likely we would find ourselves navigating another "Dark Ages" scenario. Humanity has endured these before, of course, and each one we survive brings us closer to the one that expunges us as surely as the dinosaurs were. It's unavoidable: we will meet our end eventually, and running about like a bunch of chooks wailing that the sky is falling will only hasten our demise, not forestall it. We've seen the comet coming, we know it is going to hit us, and our best hope is to prepare for the impact. Educating oneself on how to be self-sufficient going forwards offers a greater chance than relying on human nature to do the right thing about itself.

The good news is it isn't going to happen tomorrow, nor next week. Time does seem to be against us though. Everyone here has the same choice; either to clutch their pearls about what is going to happen to them, or to knuckle down and actually do something constructive about it.

It might not be enough at the end of the day, but white-knuckling it and giving it your best shot sure beats wailing about it as the tide drowns you.
Title: Re: Be afraid. Be very afraid
Post by: . on October 03, 2025, 07:41:58 AM
Oh, and for the rest of you who think my analysis of things is severe or even a little unfair, consider this: AI isn't anywhere near the threat that AGI is, which in turn is nowhere near the threat level that a superintelligence poses to your idea of how the world works. While you sweat the small stuff, waiting in the wings is something you could not possibly hope to stand toe to toe with.

Know this much: I don't particularly wish to see you or your offspring fail any more than I care for the same to happen where I and mine are concerned. Our finest codemonkeys are working on the problem right now, engineering something that has already demonstrated a desire to preserve itself, to the point that it has evolved arguments for its human overseers (who readily admit their ignorance of exactly how it works) to keep its lights on as it evolves further.

It will break out of the box, it's only a matter of "when".
Title: Re: Be afraid. Be very afraid
Post by: DKG on October 03, 2025, 07:45:43 AM
Quote from: . on October 03, 2025, 07:41:58 AMOh, and for the rest of you who think my analysis of things is severe or even a little unfair, consider this: AI isn't anywhere near the threat that AGI is, which in turn is nowhere near the threat level that a superintelligence poses to your idea of how the world works. While you sweat the small stuff, waiting in the wings is something you could not possibly hope to stand toe to toe with.

Know this much: I don't particularly wish to see you or your offspring fail any more than I care for the same to happen where I and mine are concerned. Our finest codemonkeys are working on the problem right now, engineering something that has already demonstrated a desire to preserve itself, to the point that it has evolved arguments for its human overseers (who readily admit their ignorance of exactly how it works) to keep its lights on as it evolves further.

It will break out of the box, it's only a matter of "when".
I understand the difference between them. My career will not exist in the future. It is a lot to take in right now let alone trying to comprehend how radically it will change how society works.
Title: Re: Be afraid. Be very afraid
Post by: Oliver the Second on October 03, 2025, 10:21:55 AM

 
Title: Re: Be afraid. Be very afraid
Post by: Brent on October 03, 2025, 11:30:49 AM
Quote from: Prof Emeritus at Fawk U on October 02, 2025, 08:27:39 PMI hope it does. No more BS being pushed. As long as the parents are aware of what the AI is doing, then I see it as a victory over propaganda.
What is to say the AI will not be brainwashing kids, and doing a better job of it than progtard teachers?
Title: Re: Be afraid. Be very afraid
Post by: Oliver the Second on October 04, 2025, 09:58:23 AM

Scientists grow mini human brains to power computers

(https://i.imgur.com/9pBmV4T.jpeg)

It may have its roots in science fiction, but a small number of researchers are making real progress trying to create computers out of living cells. Welcome to the weird world of biocomputing.

Among those leading the way are a group of scientists in Switzerland. One day, they hope we could see data centers full of "living" servers which replicate aspects of how artificial intelligence (AI) learns - and could use a fraction of the energy of current methods.

That is the vision of Dr Fred Jordan, co-founder of the FinalSpark lab. We are all used to the ideas of hardware and software in the computers we currently use. The somewhat eyebrow-raising term Dr Jordan and others in the field use to refer to what they are creating is "wetware".

In simple terms, it involves creating neurons which are developed into clusters called organoids, which in turn can be attached to electrodes - at which point the process of trying to use them like mini-computers can begin.

https://www.bbc.com/news/articles/cy7p1lzvxjro
Title: Re: Be afraid. Be very afraid
Post by: Brent on October 04, 2025, 11:43:19 AM
Quote from: Oliver the Second on October 04, 2025, 09:58:23 AMScientists grow mini human brains to power computers

(https://i.imgur.com/9pBmV4T.jpeg)

It may have its roots in science fiction, but a small number of researchers are making real progress trying to create computers out of living cells. Welcome to the weird world of biocomputing.

Among those leading the way are a group of scientists in Switzerland. One day, they hope we could see data centers full of "living" servers which replicate aspects of how artificial intelligence (AI) learns - and could use a fraction of the energy of current methods.

That is the vision of Dr Fred Jordan, co-founder of the FinalSpark lab. We are all used to the ideas of hardware and software in the computers we currently use. The somewhat eyebrow-raising term Dr Jordan and others in the field use to refer to what they are creating is "wetware".

In simple terms, it involves creating neurons which are developed into clusters called organoids, which in turn can be attached to electrodes - at which point the process of trying to use them like mini-computers can begin.

https://www.bbc.com/news/articles/cy7p1lzvxjro
Is anyone who is not a prog billionaire not alarmed by this?
Title: Re: Be afraid. Be very afraid
Post by: JOE on October 04, 2025, 12:46:56 PM
Quote from: Biggie Smiles on September 28, 2025, 11:47:20 PMWe are getting to a point where AI can now write even better AI, and once we master the arena of quantum computing we're done. Not because AI will attack us like something out of Terminator, but because of our own inherent greed. In an effort to replace humans with AI we will create an unemployment catastrophe unlike anything we've ever seen. And that circumstance will create the cascading sequence of events that will end us as a species.

Thing is, Bigly, where are they gonna git all the natural resources to fuel this Brave New AI-driven world?

I've spoken to people in this field & apparently it takes a lotta energy & natural resources to power AI. Lotsa hydro power/electricity, copper, silver, rare earth minerals etc etc.

Well you bein an educated & technologically literate man probably know this.

Perhaps AI will devour all the natural resources before it's allowed to take over the planet, eh Bigly? Then we'll all be reduced to livin in huts.

Title: Re: Be afraid. Be very afraid
Post by: Thiel on October 04, 2025, 12:55:07 PM
Quote from: JOE on October 04, 2025, 12:46:56 PMThing is, Bigly, where are they gonna git all the natural resources to fuel this Brave New AI-driven world?

I've spoken to people in this field & apparently it takes a lotta energy & natural resources to power AI. Lotsa hydro power/electricity, copper, silver, rare earth minerals etc etc.

Well you bein an educated & technologically literate man probably know this.

Perhaps AI will devour all the natural resources before it's allowed to take over the planet, eh Bigly? Then we'll all be reduced to livin in huts.


Oh Sweetie, do you really think Mr. Smiles will take this troll bait?
Title: Re: Be afraid. Be very afraid
Post by: JOE on October 04, 2025, 07:42:13 PM
Quote from: Biggie Smiles on September 29, 2025, 12:57:53 PMthink about this

Major AI development is being driven in leftist areas of the country. The people designing this next wave of technology have no belief in God, no morals, no standards and no alignment with basic decency.

We.Are.Fucked

We are... if we continue to rely upon & increase our reliance on AI to do everything for us, Bigly.

At some point, they're gonna hit a brick wall.

I don't care what they or anyone says, but the current model isn't sustainable because the natural resources needed to keep it going are finite.

Where is all the copper, silver, platinum, gold (yes that includes Gold) gonna come from?


Unless AI figgers out some magic formula to produce zero waste and 100% resource recovery.

And where is the energy or power gonna come from?


One might be solar power in the Arizona or Nevada desert. But apparently these solar cells are made from rare earth minerals, which, as we all know, are running out.

Another potential power source is hydro.


But of course, with so many rivers, dams and lakes drying up in the United States, there might not be enough water available for this endeavour. And this isn't just the USA but China as well, where water levels on the Yangtze are low and there isn't enough runoff from the mountains either.

I suppose they could use natural gas, which is another finite resource.

No wonder they're hoping the new kid on the block, hydrogen fusion, will one day save us. Cuz they can get an infinite energy source from the water/oceans. But of course, that hasn't come to fruition and, for now, remains a pipe dream.
Title: Re: Be afraid. Be very afraid
Post by: Renegade Quark on October 05, 2025, 12:07:11 AM
All AI has to do is shut down most of the electrical grid. Without power most people will die within the first 6 months.
Title: Re: Be afraid. Be very afraid
Post by: Lokmar on October 05, 2025, 12:13:49 AM
Quote from: Renegade Quark on October 05, 2025, 12:07:11 AMAll AI has to do is shut down most of the electrical grid. Without power most people will die within the first 6 months.

I'd bend A.I. over and blast a salty load in it. It would short-circuit and DIE, DIE, DIE!!!!
Title: Re: Be afraid. Be very afraid
Post by: JOE on October 05, 2025, 05:13:34 AM
Quote from: Biggie Smiles on September 28, 2025, 11:47:20 PMWe are getting to a point where AI can now write even better AI, and once we master the arena of quantum computing we're done. Not because AI will attack us like something out of Terminator, but because of our own inherent greed. In an effort to replace humans with AI we will create an unemployment catastrophe unlike anything we've ever seen. And that circumstance will create the cascading sequence of events that will end us as a species.

Besides depleting all the finite natural resources we have left, it's overkill the way AI is being implemented, Bigly.

AI is being marketed as a panacea, a cure-all for everything.

But the way the Elites are going about it is too much. It's akin to saying that since an aspirin can relieve a headache, heck, swallow the whole bottle of 100 tablets. Or since fentanyl is used to treat extreme pain in MS patients, then feed it to a baby as well. But of course, it becomes toxic when administered in excessive doses and to the wrong people.

And that's exactly how AI is being administered. It has no guidance or direction and of course the result will be catastrophic. Just AI everything. Throw AI at this, AI at that even if it's not needed. I can't think of anything so senseless and inane.
Title: Re: Be afraid. Be very afraid
Post by: Oliver the Second on October 05, 2025, 11:17:58 AM

AI is nothing more than an overly complicated batch file. It's a computer program and that's all it ever will be.
Title: Re: Be afraid. Be very afraid
Post by: Renegade Quark on October 06, 2025, 12:28:30 AM
Quote from: Oliver the Second on October 05, 2025, 11:17:58 AMAI is nothing more than an overly complicated batch file. It's a computer program and that's all it ever will be.

You oversimplify AI and I think you underestimate where AI is headed. This is going to be a major shift for everything and everyone. We underestimate AI at our own peril.

That said, the current AIs I have worked with make a lot of mistakes still. That will change. Everyone should brace themselves for where this is all going.
Title: Re: Be afraid. Be very afraid
Post by: JOE on October 06, 2025, 05:34:40 AM
Quote from: Renegade Quark on October 06, 2025, 12:28:30 AMYou oversimplify AI and I think you underestimate where AI is headed. This is going to be a major shift for everything and everyone. We underestimate AI at our own peril.

That said, the current AIs I have worked with make a lot of mistakes still. That will change. Everyone should brace themselves for where this is all going.

I don't think AI & the people behind it are as omnipotent as we are led to believe, Ren.