GPT and LLMs: Good? Bad? Both?

Where we talk about modern advancements like the abacus and printing press.

Re: GPT and LLMs: Good? Bad? Both?

Post by Soloist »

Bootstrap wrote: Sun Dec 10, 2023 4:44 pm
Soloist wrote: Fri Dec 08, 2023 11:42 pm
Bootstrap wrote: Fri Dec 08, 2023 5:59 pm Especially if an abstract for a paper needs to be 500 words or something ... or I can't quite find the right words, and I want to generate several possible things that may not be what I want either, but at least give me ideas for what I do want to write.
Wife: Are schools actually ok with this?
Schools are very much wrestling with what this means for academia. I think there are ways to use GPT and LLMs that enhance education and critical thinking, and I also think there are ways to use them to avoid thinking at all and to cheat. Schools are going to have to figure out which is which. If students have access to the Internet, they have access to GPT and LLMs.

That's similar to other technologies like the Internet, Google, and Wikipedia. Or in earlier days, paper or video. The tools we have change the game.
Soloist wrote: Fri Dec 08, 2023 11:42 pm Maybe I'm getting old, but it sounds very similar to plagiarism, and it seems very likely to be used for that nowadays.
If I generate a 500-word abstract that summarizes a paper I wrote, and I then edit the result, that doesn't quite seem like plagiarism. Especially if I generate several from different perspectives, then write an abstract with the best from each. If it makes my writing better, and I'm still doing the thinking, it doesn't make me feel too guilty. Should it? How is it different from working with a professional editor?

I suppose this depends partly on context. At work, I could hire an editor to do this work if I had the budget and there would be no ethical issue. And the editor could generally do it better. That's different from writing an essay for school, where it's morally wrong to get that kind of help.
Soloist wrote: Fri Dec 08, 2023 11:42 pm I know there would have been times when I would have loved to have ChatGPT help me write things in middle/high school, but it sounds like it's just going to make students lazier. Even if you change some of it to make it your own, you are having a robot do a large portion of your homework for you.
For school assignments, I agree with you. I don't think it's made me lazier in the settings I use it in, though. I'm working just as hard, but differently. I think I'm as actively engaged, and I have to check everything that is generated before I can trust it.
Wife: yah, I can see uses for it in work settings, or for doing an abstract for a study if you carefully proofread and edit. I can even imagine how useful it would be to help write stories, books, or articles if I tweaked things around / took out or added things. I think it might help in those contexts. Given how many novels are just clones of other novels, I don’t see an ethical problem with having an AI help you write a book as long as it’s not a piece of junk.

I just remember how big my school was on preventing plagiarism, and they could run a text search to see whether your paper showed up anywhere else online. If I’m writing a story or an essay for class, even if I do several different versions and change some of it to make it my own, I would have it done a lot quicker and easier than if I had to do it without AI help. The average student would not use ChatGPT for only honorable purposes if they thought they could get away with it. I guess running it through a test to see how much was AI generated would help, but I can see a big headache for high school and college teachers.
Soloist, but I hate singing alone
Soloist, but my wife posts with me
Soloist, but I believe in community
Soloist, but I want God in the pilot seat

Re: GPT and LLMs: Good? Bad? Both?

Post by Pelerin »

Bootstrap wrote: Fri Dec 08, 2023 5:22 pm
mike wrote: Fri Dec 08, 2023 4:20 pm One problem I can see is that AI generated text is insanely BORING. The best way to get me to ignore a web page or forum post or anything else is to let it be obviously AI generated. I'm not saying it has no useful purpose. But it sure doesn't take the place of a human writer. Internet searches on a topic, more and more, are turning up AI generated pages that are nothing but summaries of information found elsewhere, with no useful purpose other than serving ads for the website owners. AI generated articles tweaked for SEO must be making some of these guys big money in ad revenue. It's one of the modern equivalents of junk mail, with the exception that creating and hosting these sorts of websites is extraordinarily cheap compared to actually sending out mail pieces.
I asked ChatGPT to process the above text with the following prompt:

rewrite the following to be more interesting - add some emotion, make it personal and relateable

Here's the result:

Navigating the web these days feels like wading through a sea of AI-generated content. It's like expecting a gourmet meal but ending up with bland fast food. Sure, AI is great for gathering info quickly, but it lacks the human touch that brings writing to life. These AI articles, optimized for SEO and ad revenue, are like digital junk mail – plentiful and monotonous. They're efficient, no doubt, but where's the soul? It's like we're trading the vibrant, quirky charm of human writing for a convenient yet uninspiring sea of sameness. Sometimes, I just miss that spark of personality and emotion that only human writers can bring.
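
(An aside: if anyone wants to run the same experiment outside the chat window, a minimal sketch using the OpenAI Python client could look like the snippet below. The model name and the environment-variable setup are my assumptions; Bootstrap presumably just used the web interface, and any chat-capable model should behave similarly.)

# Minimal sketch, assuming the openai>=1.0 Python SDK and an API key
# in the OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()

# Mike's original paragraph would be pasted in here verbatim.
original_text = "One problem I can see is that AI generated text is insanely BORING. ..."

# Same prompt Bootstrap used, with the source text appended.
prompt = (
    "rewrite the following to be more interesting - add some emotion, "
    "make it personal and relateable\n\n" + original_text
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model name
    messages=[{"role": "user", "content": prompt}],
)

print(response.choices[0].message.content)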
The GPT output makes Mike’s point pretty well, I think. It’s overwrought and reads like a high schooler’s idea of good writing; see, for example, the parts I put in italics (I kind of wish we had a better markup system for commenting). “Be sure to use at least two similes and one metaphor.”

Worse than bad writing, though, is that GPT has inserted a few ideas of its own that it didn’t get from Mike and that it couldn’t have reasonably inferred from what Mike actually wrote. Look at the sentences in GPT’s response that I’ve underlined. Mike didn’t say anywhere that AI is good for gathering information quickly, nor did he compliment it for being efficient or convenient. Indeed, as a human I would infer that Mike finds getting the information he’s looking for harder and less convenient because of AI-generated sites.

Compare the sentences that I’ve underlined in Mike’s post with GPT’s response and you’ll see that it omitted these ideas. There’s no mention that the AI sites are redundant and useless; they are merely “optimized” for ads. The money-grubbers driving all this don’t even get a mention. Mike’s comparison of AI sites with junk mail, which he meant literally, is just turned into an awkward simile.

Finally, look at what is emphasized. GPT decided that Mike’s main concern was the human touch of the writing, apparently inferring this from one word, “boring”, in his first sentence. Even then it’s a stretch to equate “boring” with a lack of “quirky charm”; pretty sure quirky charm isn’t what Mike was looking for. Nonetheless, GPT makes this element about half of its rewrite, padded out mostly with clichés (“where’s the soul?”). It’s high school-level writing again, but that only makes sense; the clichés are serving the same function they do in high school essays: filling up space because you didn’t really understand the assignment.

Re: GPT and LLMs: Good? Bad? Both?

Post by Ernie »

I don't think ChatGPT is good for producing YouTube videos.

e.g. a very pointless video, in my opinion.
The old woodcutter spoke again. “It is impossible to talk with you. You always draw conclusions. Life is so vast, yet you judge all of life with one page or one word. You see only a fragment. Unless you know the whole story, how can you judge?”