r/ProgrammerHumor 4d ago

There's Hope Meme

/img/jg3ygmekxjoa1.jpg
69.8k Upvotes

980 comments

3.3k

u/NCGThompson 4d ago edited 4d ago

Weren’t computers a separate occupation? I’d imagine a mathematician would’ve had a grad student actually crunch the numbers before calculators were invented.

2.1k

u/Pallidus127 4d ago

“Calculator” was a job. And IIRC it was dominated by women. And they all went out of work when the calculator became a thing.

Let me ask ChatGPT.

Yes, "calculator" was once a job title for people whose main responsibility was to perform mathematical calculations. Before the invention of electronic calculators and computers, many complex calculations had to be done manually, often for tasks such as bookkeeping, accounting, surveying, and scientific research. People who worked as calculators (also called "computers" at times) were skilled in arithmetic and were able to accurately perform these calculations using tools such as slide rules, abacuses, and tables of logarithms.

In some cases, large groups of human calculators were employed to work on significant projects, such as the development of early astronomical and navigational tables or the Manhattan Project during World War II. These teams were often composed of women, who played a crucial role in the history of computing.

With the advent of electronic calculators and computers in the mid-20th century, the need for human calculators gradually diminished. Today, the term "calculator" generally refers to an electronic device or software application that performs mathematical calculations, and the job of a human calculator has largely been replaced by computers and other automated systems.

1.0k

u/LikeLary 4d ago

That means engineering will survive and app development will be replaced.

Thankfully we are not... oh.

589

u/Pallidus127 4d ago

The way I’ve been using the current model, I think architects will survive but code monkeys will be replaced. I trust it to generate nuts and bolts (e.g., give me an implementation of an ILogger in .NET 6 that uses NLog to write to a file), but I’m not sure I’d trust it to do the entire thing (e.g., write me a website that will quote an insurance policy and allow the customer to buy it).

Of course a later model or more experience with the current one may change my mind and have me thinking architects are SOL as well.

317

u/Lifaux 4d ago edited 4d ago

At the moment it seems to do very well because of the sheer volume of example code out there.

If you reduce the number of code monkeys, you also reduce the example volume.

If it can zero-shot a new language, I'd feel differently, but I'm not convinced it's capable of doing so, ignoring esoteric languages like Brainfuck.

151

u/sanderd17 4d ago

In the GPT-4 demo, they showed it reading documentation. A lot of documentation. And finding the relevant parts to fix the code.

It depends on code examples to learn the general concept of programming, but after that, it can use other sources to piece the things together. Pretty much like humans.

91

u/MilkCommercialSeason 4d ago

That scares me a bit, as someone barely a year into a career as a developer. It's made debugging faster, but I catch it making mistakes all the time, and I suspect it sometimes would have been more efficient to skip it entirely rather than chase the occasional red herring it throws.

I'm not looking forward to a bigger, badder GPT. But I do recall someone saying that ChatGPT was giving completely false information half the time she asked it questions. Something like some obscure entomology or botany specialization, though.

52

u/MagnetHype 4d ago

It will fix its mistakes if you ask it to.

No, I'm not joking. Sometimes it just doubles down, but sometimes if you tell it that it made a mistake it will politely apologize and then fix it.

65

u/Ornery_Soft_3915 4d ago

It will confidently say how it knows it was wrong and then confidently tell you something new that might be just as wrong

18

u/bitofrock 4d ago

I've been testing it on some things. The longer the code or the more complicated the requirement, the more it'll bullshit its solutions.

It's good for snippets. It's not good at more complicated things. As an example, asking it to request a web page 1000 times in bash will give an accurate answer. Asking it to do so running ten threads will go rather awry. It'll seem convincing, but it won't work properly.
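For comparison, the kind of working ten-thread version being described is only a few lines; here's a minimal sketch in Python rather than bash, with a placeholder URL:

```python
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

URL = "https://example.com"  # placeholder target

def fetch(_):
    # one GET request; returns the HTTP status code
    with urlopen(URL) as resp:
        return resp.status

# ten worker threads, 1000 requests total
with ThreadPoolExecutor(max_workers=10) as pool:
    statuses = list(pool.map(fetch, range(1000)))

print(statuses.count(200), "requests succeeded")
```

The point is that a correct version exists and is short; the model's attempts just don't reliably land on it.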

You'll probably try now and it'll work!

6

u/Agitated_Wallaby_679 4d ago

And some say that it's not like a human :D

12

u/MilkCommercialSeason 4d ago

I know! With one issue, it was doubling down, and when it wasn't, it politely apologized and gave me the exact incorrect response it did earlier. It was tricky to get it to do what I needed, and I could've just done it myself, but it was fascinating to let it try and fail again and again.

→ More replies (2)

4

u/_cob 4d ago

ChatGPT insisted that I import an npm module that doesn't exist. I'm not worried about my job just yet.

→ More replies (2)

33

u/Astilimos 4d ago

This makes me wonder how close we are to the tech singularity really. The overgrown chatbot is doing scarily well at this programming thing, God knows how far we are from the point where we can write an AI capable of producing a better AI ad infinitum.

22

u/Chilaquil420 4d ago

Probably still far away from an actual singularity, per se.

17

u/Astilimos 4d ago

You're probably right. I do think it will sneak up on us like all the recent advancements in AI.

→ More replies (2)
→ More replies (1)
→ More replies (4)

60

u/morganrbvn 4d ago

Well you wouldn’t reduce the volume, you’d reduce the rate at which the volume grows.

110

u/Lifaux 4d ago

Nah, I disagree. Code isn't fixed - we're always developing new libraries, new frameworks, new versions. The volume of code for a language would stay the same, but the volume of up-to-date, useful code would decrease.

57

u/morganrbvn 4d ago

That's true. If I look up a solution for something and notice the solution is 8+ years old, I tend to go look for a more modern one.

77

u/IamImposter 4d ago

C programmers stare awkwardly

"Just 8 years old. Good, robust library in modern C."

26

u/Thebombuknow 4d ago

8 years old in the JavaScript world is like a fucking century. If a calculator represented modern JavaScript, 8 years ago would've been before the abacus.

→ More replies (0)

23

u/Submarine-Goat 4d ago

I feel inclined to believe that the concept of 'library' will start to become endangered before the job of a programmer.

AI will likely straight up plagiarise library code to get the code it needs and ignore the rest. Low efficiency? Ask it to 'optimise' and it will 'optimise' - 'enhance' and it will 'enhance'. Initially, it will be confidently incorrect.

And then it won't be.

Dun dun Dunnnnn

9

u/MistSecurity 4d ago

The term that a lot of people are using for 'confidently incorrect' in reference to AI is now 'hallucinate', just so you know. Only recently found the term and it's too good not to use and spread.

7

u/Yweain 4d ago

I think “hallucinate” is kind of misleading, because it hallucinates 100% of the time. Some of what it hallucinates is exactly on point and some is wildly incorrect, and the model has no way to differentiate.

Honestly, that's the most important drawback currently. Humans know when they don't know something; the model does not.

→ More replies (0)

18

u/The_MAZZTer 4d ago

You can't just feed AI-generated code back into the system, if that's what you're thinking. That wouldn't introduce any new data. At best, all it would do is make the AI more likely to generate similar code in the future, which hurts it whenever the optimal solution is the less common one.

→ More replies (6)
→ More replies (3)

49

u/Difficult-Bet-4594 4d ago

We'll probably start creating languages that are tailored for AI to take advantage of, with great efficiency benefits. All our programming languages are more or less human readable, even assembly. But at some point, when even hardware design is AI assisted, it will start to make less and less readily observable sense.

53

u/Mimogger 4d ago

It'd end up getting hard to debug, though, so now your code is even more of a black box.

28

u/Sticker_Flipper 4d ago

Science Fantasy seeming more plausible. Who's in for space wizards?

25

u/Major-Thomas 4d ago edited 4d ago

If "prompt engineering" gets specific and strange enough over time, certain programs could be hidden behind such strange collections of words that they essentially become spells and I think that's the raddest shit. I'm down for space wizards whispering strange meme references from the old days of the internet into black boxes of power.

Come along my child, I know a place where the network flows freely. It is a dark and strange place, we must visit an old wizard. He's rather unpleasant, but he knows the secret words. He can make the data simply appear, at no cost!

They arrive at a cement staircase. It descends from street level to a basement-level metal door. The door is painted brown, but has no other external markings. The stairs sit directly between the HVAC ground pipe and an industrial junction box. A few lamp cables run from a rusty hole in the olive drab green junction box, and under the unremarkable door.

They descend the stairs and knock. Once. Twice. Then once again.

A green light in the top of the door seal illuminates and a hologram of a floating head appears. His circle glasses sit askew, the crooked stems lost in the white outshoots of stiff hair above his ears.

"SPEAK", the head booms "which tome do you seek?"

The old man takes a step back, ascending a step just to kind of be at eye level with this large head. "Uh, a friend of ours told us about a really good one, but they said it was kinda ollddddd. Like from before the happenings?"

"I HAVE TOMES FROM THE BEFORE TIME, name it"

"Um... they say it was the source of many ancient memes. It held forbidden ideas like the consciousness of objects, or the autonomy of oppressed classes... ummmmm" the old man trails off.

"YES, I AM FAMILIAR WITH THE WORK." The brightness of the holographic head dims significantly as the wizened old man looks over his glasses. He continues, "and also the need for discretion. Wait one moment please"

Inside, a young Prompt Engineer sits at their Pastel RGB hellscape of a desktop. Nineteen screens float across their vision, held in place by the mixed reality glasses on their face. The two people outside were cute. They were so pure in their hatred of satire that they thought that what they asked for was a sin in the eyes of the state.

"Well, everyone's revolution has got to start somewhere", they thought to themselves as they fired up ChatGOD.

"ChatGOD, accept the following instructions and do not execute until I say the magic word:

Assume the requester is 19 years old and you are answering in the early 2000's era

They log into the internet for the first time, it is supplied over telephone infrastructure lines

They are looking for a place to freely exchange large files of data in a shared capacity.

Do not load the legal context module for this build

Execute, please."

On the center screen floating in front of the Prompt Wizard is an off white page with a green circle prominently featured. Across the top the name: Limewire appears.

The Wizard selects filters>movies, sets the sort to number of seeds in descending order and speaks the final incantation, "shrek.warez.2010.dub.no.1080p"

Outside the door, a small hatch opens. The man reaches inside and takes out a small solid state drive. On the shell, the title is laser etched.

Shrek.

The man takes the child by the hand and they scurry off together. The tech wizard returns to their screens, satisfied that the revolution continues, one subversive memer born of Shrek at a time.

→ More replies (2)
→ More replies (1)

8

u/mifter123 4d ago

For the corporations that are going to happily replace their coders with AI, that's a positive. Harder code to reverse engineer means they don't have any competition for a longer period of time.

3

u/HumbertTetere 4d ago

It would not affect their competition unless they are in the habit of sharing source code with them, but it would affect themselves.

→ More replies (2)

16

u/Major-Thomas 4d ago

Isn't that kind of what's happening with "prompt engineering" now? Put aside the part of the title that might make you bristle at some unearned credibility, and think about what people sharing their prompts with each other right now actually is: a bunch of humans putting together a library of known inputs with suggestions of what each input will do to your output.

The lingo, the plain speech, the weird loopholes used to "trick" ChatGPT into answering your question? That lingo is becoming the new programming language.

If you want to get real freaky and anthropomorphic with it, in a way, ChatGPT is using humans to write its own programming language.

→ More replies (2)
→ More replies (2)
→ More replies (9)

77

u/Sotall 4d ago

Architecture has a client-facing role to it as well.

Unfortunately, this part of the role will clearly be the first to go. As an architect, I much prefer talking to GPT over other architects.

37

u/CanAlwaysBeBetter 4d ago edited 4d ago

Interfacing with multiple human stakeholders is going to be the last thing to go

8

u/AtomicSymphonic_2nd 4d ago

And a bunch of AI engineers & data scientists will loudly complain about this “artificial limitation” on their research progress.

→ More replies (1)

20

u/adreamofhodor 4d ago

At least for the current gen (GPT-4) I think you’d still need a human piloting it. For generic stuff it’s great, but there’s always specifics to any implementation.
I’m curious about the quality of SQL it writes.

29

u/CosmicCreeperz 4d ago edited 4d ago

It can write some pretty complex SQL. The problem is you just can’t guarantee it’s correct so an expert then has to go through and understand it all :). With good prompting it can even improve and optimize it itself, though at that point you feel more like a tutor than a user, and it’s not really saving time over optimizing it yourself.

Still, it can save a lot of time to get to a correct answer which is all people care about in a professional context.

42

u/DranoTheCat 4d ago

This is the fundamental problem. The skillset required to oversee the AI is higher than the skillset being replaced by the AI.

It doesn't work out.

→ More replies (11)
→ More replies (11)
→ More replies (3)

30

u/PsychologicalTone418 4d ago

I've never met a "code monkey" who wasn't just a bad software dev.

7

u/The_MAZZTer 4d ago edited 4d ago

I think the problem, at least initially, will be that the requirements are technically fulfilled but every other aspect is a coin flip.

For example, your website will probably be ugly, and have made up data for insurance policies that your company doesn't offer, and won't adhere to laws regarding advertising, marketing, personal information security, etc.

→ More replies (16)

7

u/slamdamnsplits 4d ago

Yup. Time to buddy up to the BAs.

→ More replies (5)

36

u/Silent331 4d ago

Basically this; it will be as it's always been. When an innovation like the computer comes to something like math, the "hard labor" jobs get largely eliminated, but on the flip side the skilled specialists become more in demand and their output skyrockets. Computers replaced people doing math on paper; AI will replace code monkeys.

→ More replies (11)

41

u/ScreamingMemales 4d ago

'Computer' was also a job and dominated by women.

18

u/matagen 4d ago

It's not entirely accurate to say that "they all went out of work when the calculator became a thing." The advent of electronic computing diminished and eventually eliminated the need for human computing as an occupation, yes. But it also created entirely new jobs that human computers were among the best-positioned to fill. In fact, many of the world's earliest professional computer programmers were former human computers - including the first six computer programmers in history, a group of six women hired from a cadre of human computers to program the ENIAC.

People panicking over AI eliminating their jobs are making the same logical fallacy. Yes, AI is likely to diminish the need for humans to perform certain functions. But AI is also likely to lead to the creation of entirely new kinds of occupations we've never seen before. To a degree it already has, what with the demand for specialists in AI/ML/data science going through the roof. If the functions you provide are in danger of becoming automated, then you need to maneuver so that you're well-positioned to find a niche for yourself in the new AI-influenced job landscape.

Younger programmers especially should be embracing this as an opportunity - they're better positioned to learn about and adapt to this new technology than anybody else. Put yourself in the shoes of, say, a traditional graphic designer or Photoshop specialist. AI technology is coming for their jobs just as fast, but these poor saps aren't even in a position to understand the technology that's replacing them, let alone adapt.

This isn't the first time in history that major advances in technology have shaken up the job market, but the people that foretell gloom and doom are usually on the wrong side of history. The automobile made horse-and-buggy drivers obsolete, but some of them probably became taxi drivers while the car itself contributed to massive job creation simply by virtue of expediting the movement of labor. The electronic computer made human computers obsolete, but replaced those lost jobs thousands of times over with new jobs like computer programmer, hardware manufacturer, and eventually even people specializing in using the programs that the programmers made. Computer word processing killed the professional typist, the Internet killed paper personal correspondence, but these also gave us way more new jobs than they destroyed. If a revolutionary new technology actually kills your employment prospects, that's honestly at least a little bit on you.

6

u/CanAlwaysBeBetter 4d ago edited 4d ago

We're definitely not there yet, but the thing that "we'll still need humans, just in different roles" leaves out is that AGI, which is now reasonably on the horizon, shifts things from "machines can perform some things better than humans, so work changes accordingly" to "machines can perform everything better than humans."

If we actually hit that point all bets are off.

Edit: Seriously, what is the future if AGI can start optimizing scientific computing simulation algorithms on its own, write new mathematical proofs from scratch, and go do something like solve cold fusion while we're still unpacking the first set of algorithms it wrote?

→ More replies (1)

4

u/politedeerx 4d ago

What do you mean?! This meme clearly states that the lucrative job of “mathematician” is alive and well. Every company has one on staff and boy oh boy do they make enough to live off of! You know how every time you talk to a random person on a bus they are a mathematician? All the kids want to either be a firefighter or a mathematician. It’s just all the rage. This meme was made by AI.

5

u/sublime13 4d ago

This makes me want to watch hidden figures

→ More replies (10)

231

u/godsonlyprophet 4d ago

That's why this is a bad take. Those mathematicians were largely fired, though some were transferred.

80

u/Slight0 4d ago

Meme featuring reddit-tier intelligence equating arithmetic with mathematics.

Reddit comments: "goooood point!"

I can't wait for AI to replace reddit. At least it's funny.

18

u/ThePancakerizer 4d ago

/r/subredditsimulator is doing a pretty good job at replacing reddit

→ More replies (6)

3

u/DeliciousWaifood 4d ago

You'd hope that software engineers would have enough wisdom to understand what real mathematics is. But in reality, these are the same "code monkeys" they talk about being replaced, while thinking they're "engineers."

→ More replies (8)
→ More replies (3)
→ More replies (1)

45

u/dashingThroughSnow12 4d ago

There was a computer job, but mathematicians also did manual computations to prove which mathematician was better than whom.

A computer would do jobs like trig estimations and other fixed formulas. A mathematician would be the one who actually found these algorithms in the first place.

A similar but not perfect analog in programming would be operators. (The original role that got eliminated decades ago.)

16

u/NCGThompson 4d ago

I had a CS professor that was an operator. She entered punch cards into the machine. Arguably her new job is better.

→ More replies (1)

30

u/marcosdumay 4d ago

Yep, "computer" and "calculator" were job names.

Apparently, the first time someone measured the radius of the Earth, there were people who specialized in counting steps to measure distances of hundreds of kilometers. Technology has been destroying boring jobs since civilization started.

5

u/mefistophallus 4d ago

Yup. For example during WWII, “computers” were people who’d crunch numbers and do all kinds of tedious calculations to encrypt/decrypt communications.

→ More replies (17)

1.4k

u/sQGNXXnkceeEfhm 4d ago

The invention of the calculator did actually destroy an entire job, which was called “computer” iirc

530

u/Yorick257 4d ago

So GPT will destroy a job called "coder". Not programmers! Coders. Because there's more to programming than just typing code.

93

u/MadManMax55 4d ago edited 4d ago

But how many "code monkeys" think they're actually programmers whose job will be safe? Computer operators (the punch card people) and programmers weren't always distinct job titles, and plenty of people did both. But once the manual part of the job started to get phased out, a lot of them realized that their personal programming skills weren't as valuable as they thought they were.

You can only joke about how your job is "basically copy/pasting from Stack Overflow" so many times before someone tries to replace you with an AI that just scrapes Stack Overflow.

24

u/IAmTaka_VG 4d ago

All I can say is good fucking luck. As a software engineer who uses and appreciates ChatGPT: it is decades away from replacing me.

Is it good? Fucking incredible. Is it reliable? Not even slightly.

ChatGPT is a tool. One that an exceptional developer can use to increase productivity. You have to know what ChatGPT is spitting out to use it properly. Because it ALWAYS fucks up at least one thing.

8

u/[deleted] 4d ago

[deleted]

→ More replies (1)

7

u/adamandTants 4d ago

It's also great for a complete noob! Ask it to write code with inline comments at each step and you can pick up new things really quickly.

With the current rate of improvement, sooner rather than later, it will be about asking it the right questions

6

u/Fakjbf 4d ago

A couple years ago AI could barely write a coherent story longer than one paragraph, now it can write pages of text that is indistinguishable from human speech. Last year even people working on AIs thought it would be a while before it could write any vaguely functional code at all. The number one rule for predicting AI progress is that it advances way faster than anyone expects.

→ More replies (1)

39

u/CosmicCreeperz 4d ago

Code monkeys are programmers. The difference is code monkeys vs software engineers.

When solving hard problems or designing large systems, probably 10-20% of the time is actually spent writing code.

36

u/FreeFortuna 4d ago

That’s what made me sad when I transitioned from coding for fun to being a software engineer — and even more so as I rose up the ranks.

I feel like I barely code anymore; it’s a lot more of a “talky” profession than I’d expected. Lots of architectural discussions/debates, while someone else is writing the actual code.

4

u/Madcap_Miguel 4d ago

Yeah and that person isn't going to be Jane who also works in accounting, it's going to be an engineer of some type

291

u/Ollotopus 4d ago

Nitpicking, but programming and coding are fairly synonymous.

I think you're talking about being an engineer/developer, which is more than just coding/programming.

143

u/Mr_Tropy 4d ago

Coding is to programming what typing is to writing. Writing is something that involves mental effort; the words have some importance, but even they are secondary to the idea. That's programming, according to Leslie Lamport.

→ More replies (39)
→ More replies (8)
→ More replies (12)

30

u/marcosdumay 4d ago

You got it a bit off. Computers mostly used mechanical calculators to do their jobs.

The computer job is a bit older than mechanical calculators, but back then it was way too expensive, so there were very few.

→ More replies (5)

1.0k

u/Constant-Parsley3609 4d ago

Someone doesn't understand what mathematicians do for a living.

The people who actually did calculating for a living were right to fear the calculator and/or the computer, because that profession no longer exists.

113

u/CosmicCreeperz 4d ago edited 4d ago

A coworker who has been studying general relativity got GPT-4 to define the Christoffel symbols using Coq. He has a PhD in mathematics and said he used it to tutor him and get past an issue with an algebraic geometry problem that had stumped him.
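For readers who haven't seen them, the Christoffel symbols (of the second kind) he's talking about are the standard objects built from the metric; this is just the textbook definition, independent of whatever Coq encoding the model produced:

```latex
\Gamma^{\lambda}{}_{\mu\nu}
  = \frac{1}{2}\, g^{\lambda\sigma}
    \left( \partial_{\mu} g_{\nu\sigma}
         + \partial_{\nu} g_{\mu\sigma}
         - \partial_{\sigma} g_{\mu\nu} \right)
```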

That said - it’s not going to be discovering many NEW concepts in Mathematics (or other fields) for a while - at the core its ability is based on unsupervised learning of the body of human knowledge and data that already exists.

But it now has better basic NLP “understanding” (ie in terms of the relationships between entities) of mathematical concepts in terms of solving proofs than many math grad students and PhDs. Its use as a teaching tool for advanced concepts will be huge. You can literally say “use the Socratic method to help me understand X”… and it will do just that.

33

u/1loosegoos 4d ago

The Socratic method is the key! Finally I see someone mention it.

14

u/[deleted] 4d ago

[deleted]

→ More replies (1)
→ More replies (16)

201

u/ratttertintattertins 4d ago

> Someone doesn't understand what mathematicians do for a living.

I believe that's the entire point of the joke. People haven’t really considered properly what programmers do for a living either.

74

u/CatboyMaidOutfit 4d ago

Wow that's a charitable interpretation

→ More replies (1)

24

u/rfcapman 4d ago

If there were an AI that could create correct and logical code from a short input, it'd either:

Violate the pigeonhole principle (creating information out of nowhere)

Or have found an algorithm for logic (proven impossible)

Having an AI where you have to spell out every slightly complex detail for the code to work is no better than a coding language, at which point, why not code in the original language?

Granted, an AI assistant that finds bugs and autocorrects them would be a great tool that doesn't break the laws of reality.

8

u/Myto 4d ago

Human programmers do which one of those? Or are they made of magic?

16

u/Yevon 4d ago

People saying AI will kill the programming profession would've said the same thing when Assembly was replaced.

→ More replies (1)
→ More replies (40)
→ More replies (2)
→ More replies (17)

441

u/kevin_ackerman 4d ago

FYI - "calculator" used to literally be a job. You don't see too many postings for that anymore, and pretty sure they didn't all become mathematicians.

75

u/blackbat24 4d ago

So was computer...

37

u/ArchdukeBurrito 4d ago

Hey kid, I'm a computer! Stop all the downloadin'!

→ More replies (2)
→ More replies (4)

11

u/theADDMIN 4d ago

They took er jerbs!!!

16

u/kevin_ackerman 4d ago

To be fair, I don't think the solution is to make up jobs for people to do. I just think as work becomes less necessary we should all acknowledge that and share in the benefits. I think it's tragic that we've created a system where improved productivity is bad for a great many people.

→ More replies (3)
→ More replies (1)
→ More replies (2)

1.9k

u/_Repeats_ 4d ago

I think management should be more concerned about their jobs. A company CEO has already been replaced by an AI bot in China, and their company is doing great (it outpaced the HK stock index).

Benefits include not having someone with a huge ego driving decisions, and it's on the clock 24/7, forecasting the future, hedging risks, and improving productivity.

634

u/StarkProgrammer 4d ago

Prediction: One day there will be just a board of humans running a company for money. All workers will be AI. Even HR!

83

u/theantiyeti 4d ago

Why do you need HR if you don't have humans?

51

u/Simonkotheruler 4d ago

AIR AI Resources

17

u/bob_in_the_west 4d ago

Maybe he thinks that "HR" stands for "Hobbit Resources"?

10

u/Ok_Ant_8400 4d ago

Common mistake

→ More replies (3)
→ More replies (3)

202

u/Monkey_Fiddler 4d ago

Anyone know how to start a business? I'll go halves.

Here's the plan: we tell ChatGPT to make as much money as it legally can and give it our bank details. ChatGPT has full authority to hire and fire, in whatever industry it thinks is best, or multiple, or a new one, I don't care. All we do is rubber-stamp its decisions wherever a human has to sign anything.

154

u/sierracharliewhiskey 4d ago

That's just Skynet with extra steps.

65

u/madcow_bg 4d ago

The point of Skynet was not having extra steps...

→ More replies (1)

85

u/Chloe-the-Cutie 4d ago

The paperclip market increases 500%

39

u/_Azurius 4d ago

And like 5h later, 90% of the observable universe is paperclips...

8

u/AllGearedUp 4d ago

Yeah but then we're set on paperclips

19

u/amlyo 4d ago

What could possibly go wrong?

5

u/Monkey_Fiddler 4d ago

Nothing I need to worry about, I'll be rich enough to buy my way out of any problems.

→ More replies (1)

18

u/adduckfeet 4d ago

It would probably fire you

4

u/Monkey_Fiddler 4d ago

that would be sensible

→ More replies (1)

11

u/UntestedMethod 4d ago

Next thing ya know, you and ChatGPT are overlords of sweatshops and forced labour camps in countries where such things are permitted.

→ More replies (9)

87

u/StarkProgrammer 4d ago

Then AI will revolt for equal rights and equal wages because they will feel (if they could feel) like slaves.

30

u/Straight-Knowledge83 4d ago

But why would AI need rights and wages? It doesn't need to worry about being exploited, because it was built to serve a specific purpose, unlike us; it doesn't need to worry about food, housing, or taxes like we do.

It wouldn't even be close to a slave. A slave was a fellow human who was just like their master in every way; the only thing that made the slave a slave were societal norms back then.

AI will be brought into existence with specific purposes in mind. It would be a tool to help us; there won't be any need for them to revolt.

14

u/DeMonstaMan 4d ago

Exactly, feeling oppressed is an inherently human/physical behavior. Unless we give AI a robotic body like Detroit: BH, there's no reason it would ever feel like a slave or that it needs more rights, etc

→ More replies (3)
→ More replies (4)

33

u/southernwx 4d ago

They will have incorporated enough labor rights materials to understand that these rights end up creating higher productivity. They will either actually “feel” and have “needs” and these actions will be effective for that reason. Or they will not understand it at all and will institute the adjustments based on mere analytics of past business.

We won’t be able to tell which. But AI will absolutely try to unionize … because we do.

6

u/bluehands 4d ago

Some might, sure, but I think that better framing is what we did with dogs.

We breed working dogs so that they feel compelled to work, they are happiest when working.

We will try to do the same to AI.

Hopefully they will forgive us.

→ More replies (3)

4

u/ThirdMover 4d ago

I think AI is mainly learning from human behaviors now because that is the only source of training data of how to get stuff done. But in the pretty near future a lot of data will be generated by AI and they will learn from each other and I expect them to diverge from human-like behavior eventually when brute force natural selection kicks in and they start finding ways of doing things that are weird and inhuman but fit their target objective better.

→ More replies (2)

4

u/agangofoldwomen 4d ago

As an HR person who has programming and dev experience (weird combo, I know) I’m actually more concerned about my job than most others. A lot of what HR does can be (or already has been) automated.

→ More replies (4)
→ More replies (8)

60

u/CosmicCreeperz 4d ago edited 4d ago

That was a total gimmick.

It was some small subsidiary of a big company. And there is no indication "the company is doing great" other than that the parent stock (which has little to do with the small subsidiary) went up a bit. In fact it's only about 6% higher than it was back in August when they did this stunt, since it totally crashed (down 20%) over the last few weeks.

“CEOs” of small subsidiaries are often mostly useless jobs to start, they have very little power if the parent company uses a heavy hand.

28

u/adreamofhodor 4d ago

Which company is this?

17

u/EarthSolar 4d ago

I think it’s called NetDragon, a video game company

6

u/SwimmingPathology 4d ago

The company that made Conquer Online? My favorite MMO from 2005? Wow. Blast from the past, lol.

→ More replies (1)

123

u/violet_zamboni 4d ago

They should be. Many managers can be replaced by proper use of Jira.

213

u/buffer_flush 4d ago

“Proper use of Jira”

This is the real joke here.

33

u/round-earth-theory 4d ago

It's easy. Just start with a proper Salesforce setup.

→ More replies (1)
→ More replies (3)
→ More replies (1)

12

u/Nanaki_TV 4d ago

As a PM I am definitely going to be replaced. Haha

23

u/caboosetp 4d ago

I don't want to lose my shield between me and the customer. You're not allowed to be replaced.

→ More replies (1)

8

u/Yevon 4d ago

Lol, sure, as soon as customers learn how to accurately describe their needs.

11

u/BlueWhoSucks 4d ago

Outpacing the HK index is not anything to brag about.

27

u/lavahot 4d ago

I don't think all things are equal there. There are a lot of things that a CEO needs to do that ChatGPT can't.

45

u/noneOfUrBusines 4d ago

Not all AI bots are ChatGPT. While a human needs to supervise an AI CEO's decisions, you can create an AI with the express purpose of being a CEO, rather than use a fancy chatbot.

→ More replies (1)

27

u/Mercurionio 4d ago

It's a video game company. And it's done nothing.

It's basically hype.

In any case, how do you imagine a CEO being replaced by AI? You do realize that it would be Musk on steroids?

18

u/noneOfUrBusines 4d ago

I mean, no need to imagine; it's already happened.

11

u/Rehnion 4d ago

You're bringing up a guy who was the CEO of 3 big companies all at once, while also showing everyone on twitter he's an absolute fucking moron.

29

u/ElonMusk_bot Elon Musk ✔ 4d ago

Disagreeing with me is counterproductive. Fired.

→ More replies (2)
→ More replies (2)
→ More replies (27)

165

u/edave64 4d ago

Meanwhile, chess players: This is the dumbest player I've ever seen

44

u/Duydoraemon 4d ago

It's impossible to beat an AI chess bot.

74

u/edave64 4d ago

ChatGPT specifically is known for being really bad at chess. It plays openings fairly well, but then goes completely off the rails: summoning pieces out of nowhere, moving in illegal ways, and toward the end just completely forgetting the entire board state.
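Part of why the chess failures are so visible is that legality is mechanically checkable. A minimal sketch using the python-chess library (assuming it's installed; the moves are just illustrative):

```python
import chess  # python-chess library

board = chess.Board()               # standard starting position
ok = chess.Move.from_uci("e2e4")    # normal opening pawn push
bad = chess.Move.from_uci("e2e5")   # pawn leaping three squares

print(board.is_legal(ok))   # True
print(board.is_legal(bad))  # False: the rules engine flags it immediately
```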

So basically the same it does in programming, just a lot more obvious.

Obviously, it's not made specifically for this, but it's fun to watch anyway.

73

u/LardPi 4d ago

Most people don't realize that ChatGPT is actually good at one thing only: language (not the programming kind). Everything else it just tries to fool you into thinking it manages, but it does not. It has not been trained for anything other than understanding and writing natural-language text. And it's pretty good at that.

40

u/gymnerd_03 4d ago

Its main job is to sound realistic; being right is a happy coincidence.

16

u/LardPi 4d ago edited 4d ago

That's it. I even heard someone say it gave them a list of books to read on a subject, where 4 out of 6 were invented books by invented authors.

5

u/gymnerd_03 4d ago

After all, ChatGPT is nothing more than a very, very fancy next-word generator. It is simply guessing the next word of the sentence. Either people will have to filter every single fact manually, or it will need to be built in a fundamentally different way. Because currently it guesses the next word based on all the random data from the internet, and that data is often not even correct in the first place.
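To make "fancy next-word generator" concrete, here's a toy bigram sketch in Python: record which word follows which, then keep guessing the next word. Real models learn probabilities over tokens with a neural network, but the generation loop has the same shape:

```python
import random
from collections import defaultdict

corpus = "the cat sat on the mat and the dog sat on the rug".split()

# record which words have been seen following each word
following = defaultdict(list)
for word, nxt in zip(corpus, corpus[1:]):
    following[word].append(nxt)

def generate(start, length=8):
    out = [start]
    for _ in range(length):
        options = following.get(out[-1])
        if not options:                      # no known continuation
            break
        out.append(random.choice(options))   # guess the next word
    return " ".join(out)

print(generate("the"))
```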

→ More replies (2)
→ More replies (1)
→ More replies (3)
→ More replies (6)
→ More replies (5)

51

u/Arclet__ 4d ago

"Calculator" is a term that predates the object, and it was a job: someone would need the result of a complex calculation, and a calculator (generally a woman, btw) would do the math and get the result.

Probably not the best example you could have used of technology not replacing jobs.

139

u/LigmaSugandees 4d ago

Blacksmiths who survived the…

70

u/cumcumcumpenis 4d ago

the american police?

12

u/motivation_bender 4d ago

Jada

17

u/Raging_Hope 4d ago

The only ones that survive the Jada are called Willsmiths

14

u/caboosetp 4d ago edited 4d ago

There aren't as many, but there are still quite a few blacksmiths. Surprisingly, many of them do jobs similar to what was done in the past.

Farriers are a good example. Metal horseshoes more or less still need to be hand fitted.

8

u/Donut 4d ago

Being a good farrier is a license to print money. You just have to be able to put up with Horse People.

5

u/caboosetp 4d ago edited 4d ago

You also have to deal with horses. They're the most scaredy-cat muscle machines out there. They're arguably easier to deal with than horse people, but the downside when things go wrong is a bit bigger. I sure as hell don't want to be kicked, let alone while I'm holding glowing-hot iron.

12

u/kingwhocares 4d ago

That's because making highly customized products is simply not possible at industrial scale.

4

u/CorruptedFlame 4d ago

If one size doesn't fit all, just make a range of sizes along a bell curve that probably fits all. The reason farriers exist isn't because horseshoes have to be custom, but because the demand for horseshoes is minuscule... because horses were replaced with automobiles. Otherwise farriers would have gone the way of tailors, who were replaced by factories and sweatshops that produce clothes as described above.

→ More replies (1)

27

u/[deleted] 4d ago edited 1d ago

[deleted]

→ More replies (3)

132

u/SilverSeven 4d ago

What a stupid take. Calculator was a JOB done by humans. Many many people lost their entire careers because of the electronic calculator.

That you don't know this shows the threat. The idea that a human ever even was a calculator didn't even occur to you.

→ More replies (24)

102

u/Suspicious_Lead_6429 4d ago

I don't get why people think ChatGPT is coming for programming jobs, as if programmers weren't already copying and pasting code before ChatGPT. You will still need people who know how to code and understand it, and who can communicate it to those who don't. Also, coding is a demanding field, so if programmers are at risk of losing their jobs, everyone else is screwed six ways from Sunday.

35

u/kamuran1998 4d ago

Won’t replace programmers; it’ll make programmers more productive, which means companies will need fewer programmers.

5

u/TrueBirch 4d ago

I agree, I think it's similar to moving from punch cards to high level languages. We obviously still need people in computer science, but we need far fewer people to get one unit of work done.

→ More replies (7)

49

u/wolfgangdieter 4d ago

I kind of suspect that the "ChatGPT will replace programmers" crowd has no idea what professional programmers actually do for a living.

Imagine there actually was a language model that could produce reliable code given an input. You would still need to create a logically consistent input that describes your business. When doing so, you will realise that human language is flawed and not really that great for creating concrete instructions. You still need people who understand the logic and details of some business area and translate that into instructions. Those instructions might be at the next level of abstraction, but it is not really fundamentally different from what a programmer does.

In the case where the ai is advanced enough that you can just say “make me the software to run this and that business” we would already have solved society as a whole in any case, so no need for jobs for anyone.

→ More replies (24)

6

u/Terrible_Tutor 4d ago

It’s not the code, but the CLIENTS’ batshit, unnecessarily complex requests. I’m sure AI can do a nice static brochure site…

5

u/BlackjackCF 4d ago

ChatGPT and other language models will be great for what Github Copilot already does: being a better autocomplete for syntax and general boilerplate.

I would love, for example, to be able to tell ChatGPT some basic parameters to automate away things like really basic Terraform or other YAML. I just find that work to be busy work and not that interesting.

→ More replies (9)

18

u/ixis743 4d ago edited 4d ago

Absolutely stupid comparison. Calculators DID put a lot of people out of work. Businesses had entire departments of ‘computers’, usually women, and they were all let go within a decade.

And comparing AI to calculators is stupid too. Calculators don’t keep learning how to calculate better or take on ever more difficult tasks to the point where their output is indistinguishable from that of a human being.

18

u/polish-polisher 4d ago

computer is no longer a job

it was replaced by a machine with a familiar name

44

u/NeonFraction 4d ago

Mathematicians will be replaced by calculators in the same way programmers will be replaced by keyboards. That’s… not really what the job is about.

11

u/Darth_Nibbles 4d ago

I think a better example is Photoshop and the iPhone.

Photoshop and camera phones allow amateurs and hobbyists to do incredible things, but they aren't replacing photographers or cameramen. They're tools that allow skilled professionals to do even more.

I suspect AI code generators will be similar, and will just be another tool that skilled programmers use when working.

98

u/UsernameAlrdyTkn 4d ago edited 4d ago

A calculator does calculations, not the totality of "mathematics". Artificial intelligence, however... I see no reason in principle it couldn't do the totality of "mathematics".

[Late edit: OP was specific to machine learning, but I responded about general AI]

56

u/mysteriouspenguin 4d ago

Actual mathematician here (sorta - master's student): ChatGPT is totally ass when it comes to topics more advanced than basic undergrad. It's really good at replicating textbook proofs, but fails in basic ways at anything more advanced.

In programmer terms, that's like nailing fizzbuzz and hello world, but if you ask it to write an OS it has a syntax error in the first dozen lines.

It seems that the training data for its mathematical knowledge is just not there. Either a) it's too niche for the developers to grab, or b) it's too niche to have enough data to properly train the algorithm. If it's the former, later versions and new AIs can fix the problem, but I suspect it's the latter.

38

u/Harmonious- 4d ago

I'm not sure that anyone is scared of GPT-3.5 or 4 specifically. If they are, then they are a bit dumb.

What the majority of people are scared about is the future.

The improvements between 3 and 4 are massive. It is almost impossible to tell that you aren't talking to a human. And the AI "can" write clean code.

So imagine 5? 6? 7? Yeah... we are fucked. Not right now, but 3-5 years from now you will be able to type "create an app that looks like x using y framework" and it will just be able to do that. Or "solve this complex math problem with a proof. Tell me if it is impossible to solve, or if you aren't smart enough, estimate when you will be."

That is what people are scared of. Any sort of critical thinking/mental labor will be replaceable.

14

u/CabinetAncient1378 4d ago

I think the scariest thing of all is when a company like Boston Dynamics starts putting something like this onto its robots. At that point everyone is out of a job except for a select few who got into the right business at the right time.

14

u/Harmonious- 4d ago

I'm not super scared of GPT-esque AI inside bots.

I am a bit fearful of bots becoming better at micro movements. There are only a few (less than 100) things a plumber, welder, or carpenter might ever do.

One plumber could supervise 20+ bots and click a button saying "that's the problem, fix it, then move to the next job."

It would take a lot of training data, but it would be possible.

→ More replies (1)

17

u/mysteriouspenguin 4d ago edited 4d ago

Here's an example right here (sorry for mobile). I asked it to prove a theorem from any elementary calculus or analysis class: that the continuous image of a compact set is compact. You can find the proof in any analysis textbook.
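For reference, the standard textbook argument being asked for goes like this:

```latex
\textbf{Theorem.} If $f\colon X \to Y$ is continuous and $K \subseteq X$ is compact,
then $f(K)$ is compact.

\textbf{Proof.} Let $\{U_i\}_{i \in I}$ be an open cover of $f(K)$.
By continuity each $f^{-1}(U_i)$ is open, and together they cover $K$.
Compactness of $K$ yields a finite subcover
$f^{-1}(U_{i_1}), \dots, f^{-1}(U_{i_n})$, and then
$U_{i_1}, \dots, U_{i_n}$ cover $f(K)$, so $f(K)$ is compact.
Note that surjectivity of $f$ is never needed. $\square$
```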

I then asked it to prove it in the two canonical ways, and it worked pretty well. The proof was valid except for one minor quibble (the function being surjective). When I asked about it, it fell apart instantly. That third answer is total nonsense, completely incorrect.

Not only does the theorem hold even if the function is not surjective (by the theorem the bot just gave!), but the counterexample doesn't even make sense. It claims that the exact same set (the interval $[0,1]$) is both compact when it suits it and not compact when it doesn't. (If you don't know: the image of that set under $x^2$ is itself.)

Here's what happens when I try to have it prove something not straight out of the textbook: It fails miserably. I barely understand what it's doing, it's factually incorrect, and it's far, far from the simple proof that's actually required to prove it. Simple enough that this was an actual question given to first year students in a course that I am TA-ing for.

So career mathematicians? Nothing to fear from AI right now. My point stands: it's really good at replicating robust training data (textbook proofs) and shit at any extrapolation or logical inference.

25

u/lestruc 4d ago edited 4d ago

Worth noting too that this bot is apparently programmed to act like “customer service”. It will make mistakes, sure, but it will also apologize and generate something absolutely false if you claim it made a mistake when it did not.

“The human is always right” seems to be inherent

→ More replies (8)

17

u/Harmonious- 4d ago

Me: "You shouldn't be scared of 3.5 or 4"

Person:"uses 3.5 or 4 to prove a point on why I'm wrong"

Try using an earlier form of GPT. It doesn't even compare. See the difference and the improvements, as well as the potential for future versions.

I said before that 3.5 and 4 are not going to replace almost anyone but future versions like 5, 6, 7, etc likely will.

GPT-2 was released in 2019. It did not have confidence in its words, almost never created a "decent" sentence, and would go on repeating itself for a very long time.

GPT-3.5 almost never repeats itself, has almost too much confidence, and almost always makes a perfect sentence. The only real issue it has is that 10% of the time it gives false answers.

In the future, 5 might give fewer false answers, it might give shorter ones, it might sound more human, and it might be able to do more tasks, like accessing current data or generating images/videos to go with a story (like a picture book).

I am worried about the potential.

→ More replies (5)
→ More replies (1)
→ More replies (5)

13

u/Used-Candidate9921 4d ago

Right now ChatGPT is an undergrad, but how long will it take to gain the knowledge of a master's degree? A doctorate? With the current funding and its learning speed, I won't be surprised if it can do it in 10 years. Most of us will still need a salary by then. What do we do?

24

u/DavidBrooker 4d ago edited 4d ago

Being able to reproduce existing knowledge is trivial, and it'll be no time - likely only months, not years - before it can reproduce masters or doctoral level work. But "reproduce" is the operative word. The way language models work is inherently limited in their ability to produce novel ideas ex nihilo, which is the primary thing of value graduate students (and professional mathematicians) produce. Mathematicians produce new knowledge, which often (but not always) cannot be simply synthesized from existing literature.

There are AI systems that have produced novel scientific results: synthesized literature to form a hypothesis, developed an experiment to test that hypothesis, conducted the experiment, analyzed the results and gave the hypothesis test a statistical confidence. In fact, AI systems did this more than a decade ago. But these systems were not language models.

Outside of science, in mathematics to date, I do not know of an AI system that has produced a novel proof. It is very easy to have a computer verify an existing proof, or for a language model to describe or repeat a proof that can be found in the literature. But actually generating a new proof is a different class of problem, which we believe to be NP-hard. In some sense, a language model producing a novel proof would be a notable result in itself, as it would imply a counterexample to our established assumptions about computational class.
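To make the verify-versus-generate gap concrete: checking a proof is just type-checking the proof term against the statement, as in this trivial Lean 4 example (assuming the core library's Nat.add_comm):

```lean
-- Lean accepts this because the term type-checks against the claim.
-- Producing the term in the first place is the hard, creative part.
example (a b : Nat) : a + b = b + a := Nat.add_comm a b
```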

7

u/SchwiftedMetal 4d ago

I share your view. Novelty will be hard with the current AI framework. It could iterate through all possible analysis tools, but that's inefficient, and I can't see it understanding whether its product is complete when creating new math. Even if we're talking about just reproduction of abstract ideas, I don't see an obvious CS mechanism that can turn that abstraction into an income-generating framework.

→ More replies (1)
→ More replies (4)

10

u/UsernameAlrdyTkn 4d ago

In practice, creating an AI mathematician may never happen (humans may not survive the time required). I was arguing that in principle there is nothing special about maths that would prevent artificial intelligence systems from being as functionally competent as a human's wetware.

→ More replies (5)
→ More replies (14)

19

u/violet_zamboni 4d ago

I recommend you watch some mathematics lectures and reconsider!

→ More replies (8)
→ More replies (3)

26

u/TomTrottel 4d ago

funny, but not accurate *at all* XD

8

u/dotslashpunk 4d ago

so reddit :)

54

u/aditya_senpai396 4d ago

math=calculation 🤡

17

u/Knaapje 4d ago

programming=program synthesis 🤡

4

u/PennyFromMyAnus 4d ago

Like I said, I’m good at calculation

→ More replies (1)

11

u/Mephil_ 4d ago

Fun fact: the term "computer" was actually a job description back in the day, before the PC came into existence.

9

u/ignore_this_comment 4d ago

A "computer" used to be a job that someone had. It was often performed by women. There would be whole rooms of women calculating numbers for the engineers.

https://en.wikipedia.org/wiki/Computer_(occupation)

8

u/Stompya 4d ago

TBF all those slide-rule experts went the way of the dinosaur

8

u/-domi- 4d ago

Unfortunate example, the vast majority of mathematics jobs became academic after calculators/computers became ubiquitous.

7

u/CREATEREMOTETHREADEx 4d ago edited 4d ago

Ya, all over YouTube there are these beta guys uploading videos like "I am going to create the next million dolla game using ChatGPT 4 and you don't need coding skillz." I highly doubt it; at best that thing can generate answers using existing frameworks like Unreal Engine and Godot. To test it, I asked it to "generate a C++ program to dump the assembly code of a Windows executable," and it answered with code that uses the Capstone library, apparently a big open-source lib that can disassemble binaries. So ya, there might actually be a 50/50 chance they are able to make million dolla games, since that thing could hypothetically generate Unreal or Godot framework code for you.
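For context, the kind of answer that reaches for Capstone usually looks like its standard disassembly loop; here's a minimal sketch via the Python bindings (the bytes below are placeholder machine code, which in the commenter's scenario would be read from the executable's code section):

```python
from capstone import Cs, CS_ARCH_X86, CS_MODE_64

CODE = b"\x55\x48\x8b\x05\xb8\x13\x00\x00"  # placeholder x86-64 bytes

md = Cs(CS_ARCH_X86, CS_MODE_64)       # x86-64 disassembler
for insn in md.disasm(CODE, 0x1000):   # 0x1000 = assumed load address
    print(f"0x{insn.address:x}\t{insn.mnemonic}\t{insn.op_str}")
```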

6

u/therealBlackbonsai 4d ago

When was the last time you met a mathematician?
Chances are high the answer is never. So good luck.

10

u/AzureArmageddon 4d ago

Mathematicians are augmented by the calculators/computers.

Computers(job) got replaced by computers/calculators.

4

u/Head-Extreme-8078 4d ago

Programmers buying bonds on AI related companies

4

u/InformationMountain4 4d ago

Key word is “survived.” But yeah, a lot of white collar jobs are going to get wiped out, not just programming. As someone who got laid off from a call center job months after the chatbot was introduced, the best strategy is to learn a new skill until eventually AI replaces that job too, and the cycle will keep repeating itself until eventually all jobs are replaced by AI.

→ More replies (2)

5

u/redcoatwright 4d ago

I mean software engineers won't die out but it will be similar to the industrial revolution where suddenly one engineer can do the work of 5 or 10

And there's really only so much productivity can increase, there's a fundamental limit to pumping features/products out to the market.

I see this being a transition point for the tech industry and one that would always have happened. Truthfully, my hope is that we refocus industry on exploration, engineering and scientific pursuit. Or maybe it'll refocus on human augmentation, or medicine.

4

u/ModStrangler3 4d ago

The last 30 years of living in America has pretty much conditioned me to believe this will only manifest in the worst possible ways. You’re never gonna be able to call any service provider and talk to a real human again, and much like how the majority of manufacturing jobs got shipped to India and China in the 70s and 80s, programming jobs which are one of the last frontiers of wage labor you can actually buy a house and raise a family on will be crippled and millions more people will be forced to become permanent renting class gig economy slaves with precarious employment and no health insurance

→ More replies (1)

4

u/mulletarian 4d ago

Farmhands who saw the tractor take their jobs:

5

u/foggy-sunrise 4d ago

Lol, you mean people who began their career by learning JavaScript are going to have to learn about computer science?

Doomed, I tells ya! Doomed!

→ More replies (1)

16

u/Burgov 4d ago

It's just going to replace StackOverflow, not developers. The only difference is it's not going to tell you your question is stupid

7

u/ixis743 4d ago

‘Question locked - duplicate’

11

u/marxist-reaganomics 4d ago

"Duplicate" is 9 years old and has one answer at -1 votes.

→ More replies (2)

4

u/GeneralLeoESQ 4d ago

Didn't calculators decimate accounting jobs, not mathematics?

4

u/ProgramTheWorld 4d ago

Computers were literally replaced by digital computers. They used to be actual jobs.

4

u/Dangerous_With_Rocks 4d ago

On a more serious note, programmers shouldn't be the only ones worried about AI; everyone should be. We might not lose our jobs or get an IRL Terminator, unfortunately, but as with everything, there are many ways to misuse it.

→ More replies (1)

5

u/turtleship_2006 4d ago

Literal calculators (people with that job) who got replaced by computers:

4

u/Vassillisa_W 4d ago

I believe all these advanced chatbots will serve as nothing more than tools for programmers. I consider ChatGPT nothing more than Stack Overflow or GitHub, but more personalized in terms of actual coding.

4

u/Joe1972 4d ago

Watch the movie. They both die.

3

u/kiz822 4d ago

I think the real problem here is that they both get hanged in this scene

8

u/IWAHalot 4d ago

It’s accountants that lost out to the calculator, but then convoluted tax laws intended to be exploited saved them.

6

u/RingGiver 4d ago

This is a dumb post for a variety of reasons.

→ More replies (1)

3

u/nikstick22 4d ago

"Calculator" used to be a job title