
Half of Surly just became obsolete



10 hours ago, Fudge Nuggets said:

AI ain't gonna drill no oil and gas wells by itself.  Bars and titters are safe.

I've actually helped a customer build a reinforcement learning environment that they used to teach an ML model to control a 5-axis arm with a grabber to stack segments of drill pipe on a rig, so never say never. It certainly isn't gonna happen in the next decade or two or three, but I'd bet something along those lines will be developed to reduce labor costs. Hell, most manufacturing and chemical plants have significantly reduced headcount with automation alone - no AI required
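For anyone curious, the environment side of that is less exotic than it sounds. Here's a bare-bones sketch of what a pipe-racking environment could look like with the Gymnasium API - the spaces, joint limits, and reward below are made-up placeholders for illustration, not what we actually built:

```python
import numpy as np
import gymnasium as gym
from gymnasium import spaces

class PipeRackingEnv(gym.Env):
    """Toy stand-in for a 5-axis pipe-racking arm (illustrative only)."""

    def __init__(self):
        # Observation: 5 joint angles plus a gripper open/close value
        self.observation_space = spaces.Box(low=-np.pi, high=np.pi, shape=(6,), dtype=np.float32)
        # Action: a velocity command for each joint plus the gripper
        self.action_space = spaces.Box(low=-1.0, high=1.0, shape=(6,), dtype=np.float32)
        # Made-up joint configuration that counts as "pipe racked"
        self.target = np.array([0.5, -0.3, 0.8, 0.0, 0.2, 1.0], dtype=np.float32)
        self.state = None

    def reset(self, seed=None, options=None):
        super().reset(seed=seed)
        self.state = self.np_random.uniform(-0.1, 0.1, size=6).astype(np.float32)
        return self.state, {}

    def step(self, action):
        # A real environment would step a physics simulator here; this just nudges the joints
        self.state = np.clip(self.state + 0.05 * np.asarray(action, dtype=np.float32), -np.pi, np.pi)
        # Placeholder reward: get closer to the target racking pose
        reward = -float(np.linalg.norm(self.state - self.target))
        terminated = reward > -0.05
        return self.state, reward, terminated, False, {}
```

From there you hand it to an off-the-shelf trainer (PPO from stable-baselines3, say) and let it bang on the simulator for a few million steps before anything goes near real iron.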


19 hours ago, Shaggy3.0 said:

But can it attend agile/scrum meetings, create a backlog, integrate data messaging patterns, assess architecture tradeoffs, negotiate vendor contracts, and navigate internal politics?  Let's see it do that, google boy : )

 

I'm trying to tell you that when you're ready, you won't have to

[image: Morpheus (Laurence Fishburne), The Matrix]


 

 


Just now, 52-80 said:

Meaning it's now basically human.

Congrats to AI for passing the Turing test

Hahaha this is a good way to look at the garbage in, garbage out problem. These powerful conversational NLP models like ChatGPT are trained on a broad corpus of documents and data, but that data is still created and maintained by humans. Wherever humans touch the process is generally where errors and mistakes are introduced, since we tend to bring our own biases and pet solutions to a problem based on our past experiences. The ML model will pick up the confidence and persuasiveness of the authors it's trained on, but it still fundamentally lacks a cohesive understanding of the entities it talks about in a response.
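To make the garbage in, garbage out point concrete with a toy example (nothing like how ChatGPT actually works internally, but the same principle): even a dumb Markov-chain text generator will fluently repeat whatever confident-sounding nonsense its training corpus contains, because all it learns is which words tend to follow which.

```python
import random
from collections import defaultdict

# Tiny "corpus" that confidently states an error (the Sun is not a planet)
corpus = (
    "the sun is a planet . the sun is very hot . "
    "the moon orbits the earth . the earth orbits the sun ."
).split()

# First-order Markov chain: word -> list of words seen to follow it
chain = defaultdict(list)
for current, following in zip(corpus, corpus[1:]):
    chain[current].append(following)

def generate(start="the", length=8, seed=7):
    """Sample a short word sequence from the chain."""
    random.seed(seed)
    words = [start]
    for _ in range(length):
        options = chain.get(words[-1])
        if not options:
            break
        words.append(random.choice(options))
    return " ".join(words)

# Can happily emit "the sun is a planet" - it mirrors the corpus, it doesn't understand it
print(generate())
```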


31 minutes ago, Captainant said:

I've actually helped a customer build a reinforcement learning environment that they used to teach an ML model to control a 5-axis arm with a grabber to stack segments of drill pipe on a rig, so never say never. It certainly isn't gonna happen in the next decade or two or three, but I'd bet something along those lines will be developed to reduce labor costs. Hell, most manufacturing and chemical plants have significantly reduced headcount with automation alone - no AI required

O&G is one of the few industries where labor costs are fuck all compared to the rest of the cost structure.  Back in ancient times when I started out, an offshore rig would have a headcount of 50-70, depending on what kind of rig it was.  There has been a ton of automation since then, and headcounts today are basically the same.  The roughnecks and roustabouts have just been replaced with a bunch of technicians to keep all this automated machinery running.  If anything, labor costs have gone up even with fairly stable headcounts (even accounting for overall labor cost inflation).

The one definite advantage is that the number of people required to be in high-risk/dangerous areas has been greatly reduced.  That's awesome.  I don't know if we'll ever completely eliminate high-risk exposures though.  A few years ago a rig for another operator had a fatality when a floor hand got crushed to death by the automatic pipe racking system.


19 hours ago, Shaggy3.0 said:

But can it attend agile/scrum meetings, create a backlog, integrate data messaging patterns, assess architecture tradeoffs, negotiate vendor contracts, and navigate internal politics?  Let's see it do that, google boy : )

 

lulz at the idea that chatgpt isn't better at internal politics than the average programmer

15 hours ago, Dahobbs said:

Like with a lot of things, it can do a really good job. It can also do a really good job of making it look like it's done a good job, when in reality it doesn't work (although, arguably, the same is true for anything from Stack Exchange). So, definitely check that code before you implement it. And us lawyers definitely need to check those arguments and citations before relying on them. But overall, I agree. It does a really good job of giving you a starting point or helping provide a general framework to work from.

Palabra, it seems like it's useful as a research tool, but you'd quickly be in a bad spot if you tried to have it build what you're trying to build.
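To put the "check that code before you implement it" point from the quote into practice, here's the kind of two-minute sanity check worth doing. parse_timeout is a hypothetical helper a chatbot might hand you, not anything from a real library:

```python
# Hypothetical chatbot-suggested helper: turn "90s" / "5m" / "2h" into seconds
def parse_timeout(value: str) -> int:
    units = {"s": 1, "m": 60, "h": 3600}
    value = value.strip().lower()
    if value and value[-1] in units:
        return int(value[:-1]) * units[value[-1]]
    return int(value)  # bare number means seconds

# Quick asserts before the snippet goes anywhere near production
assert parse_timeout("90s") == 90
assert parse_timeout("5m") == 300
assert parse_timeout("2h") == 7200
assert parse_timeout(" 30 ") == 30
print("covers the cases we actually care about")
```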


1 hour ago, 52-80 said:

Meaning it's now basically human.

Congrats to AI for passing the Turing test

Can’t remember where I heard this joke, but it was something like: Since AIs are all being created by companies that are super sensitive to negative press, they’re constantly doing everything they can to ensure their AI doesn’t say anything dangerous or controversial. In that context, the Turing test should really be whether an AI will accurately tell you how to do something like secretly purchase the materials to build a nuke.


54 minutes ago, SquishMitten said:

Can’t remember where I heard this joke, but it was something like: Since AIs are all being created by companies that are super sensitive to negative press, they’re constantly doing everything they can to ensure their AI doesn’t say anything dangerous or controversial. In that context, the Turing test should really be whether an AI will accurately tell you how to do something like secretly purchase the materials to build a nuke.

The Turing Test is an idea from 1950 to test a machine's ability to exhibit intelligent behavior - which includes a convincing recitation of facts, a strength of natural language models. In your theoretical test, ChatGPT could do that if it's read a document that says how to do it, and it could even make it sound like a pirate wrote it or something silly like that. But it would not be able to take all these disparate data points and synthesize a new idea from them. NLP models can't really do that on their own.

 

That said - an NLP model IS an effective mechanism to take text and translate it into a machine-understandable intent, which can then be used to do something else, like create an image with DALL-E 2 or Stable Diffusion. The next step forward in complexity will be layering, combining, and interconnecting ML models and different neural net structures into a more complex architecture - which is, interestingly, how our own brains work
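As a rough sketch of that chaining idea, here are two off-the-shelf Hugging Face models wired together: a language model turns free-form text into a routable intent, and a diffusion model acts on it. The model names and toy intent labels are just illustrative, and this assumes a GPU plus the transformers/diffusers packages:

```python
import torch
from transformers import pipeline
from diffusers import StableDiffusionPipeline

user_text = "make me a picture of a longhorn steer standing on an offshore rig"

# Step 1: a language model turns free-form text into a machine-usable intent
classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")
intent = classifier(user_text, candidate_labels=["generate image", "answer question", "translate text"])

# Step 2: route the intent to a completely different model (a diffusion model)
if intent["labels"][0] == "generate image":
    sd = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
    ).to("cuda")
    image = sd(user_text).images[0]
    image.save("longhorn_on_rig.png")
```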


1 hour ago, Celery Man said:

lulz at the idea that chatgpt isn't better at internal politics than the average programmer

And lulz at the idea I listed skills for an average programmer.  No doubt, technology will replace the entry-level coding skills held by self-taught programmers, code bootcamp grads, and average computer science students equipped with l33T frAm3werkz coding skills.  But that's about it.

And as for navigating internal politics--good luck with that.  Let's see if it can keep track of the names and badges for each C-suite mistress of the month.  Will it organize access badges for "Candi", "Bekka", or "Tamee"?  Don't think so.  And I doubt AI can drive inside baseball talk for those that matter.

 



The lawyers here will be ok.  Some dumbass humans are going to trust nascent AI too quickly, with catastrophic or near-catastrophic consequences.  AI will become a bad word with an extremely negative connotation that throws its development and adoption back for decades.  Think along the lines of Chernobyl and Three Mile Island and the effect they had on “nuclear power.”

The Surly lawyers, facing the uncertain fate of having to make an honest living, will suddenly find themselves in demand again to litigate the AI catastrophe.


2 minutes ago, Goredho said:

The lawyers here will be ok.  Some dumbass humans are going to trust nascent AI too quickly, with catastrophic or near-catastrophic consequences.  AI will become a bad word with an extremely negative connotation that throws its development and adoption back for decades.  Think along the lines of Chernobyl and Three Mile Island and the effect they had on “nuclear power.”

The Surly lawyers, facing the uncertain fate of having to make an honest living, will suddenly find themselves in demand again to litigate the AI catastrophe.

Solid points.  The only way this falls apart is if there is enough of a lag between AI taking over and its inevitable failure.  I've never known a lawyer to quit the profession and return later.  The realization of just how shitty a job it is sets in quickly upon leaving, and after that there is no turning back.  It must be like taking your first breath of fresh air after having lived in a coal mine for thirty years.   You're not going back if you can help it.

