ChatGPT has passed many exams, including some law exams, and it is able to write essays.
www.businessinsider.com
Concerns have been raised about plagiarism, however.

I found this section particularly interesting:
GPT-4 can ace the bar, but it only has a decent chance of passing the CFA exams. Here's a list of difficult exams the ChatGPT and GPT-4 have passed.

This gave me a good chuckle.

It didn't take long after ChatGPT was released for students to start using it for essays, and for educators to start worrying about plagiarism.
In December, Bloomberg podcaster Matthew S. Schwartz tweeted that the "take home essay is dead." He noted that he had fed a law school essay prompt into ChatGPT and it had "responded *instantly* with a solid response."
In another instance, a philosophy professor at Furman University caught a student turning in an AI-generated essay upon noticing it had "well-written misinformation," Insider reported.
"Word by word, it was a well-written essay," the professor told Insider. As he took a more careful look, however, he noticed that the student had made a claim about the philosopher David Hume that "made no sense" and was "just flatly wrong," Insider reported.
In an interview in January, Sam Altman, CEO of OpenAI, which makes ChatGPT, said that while the company will devise ways to help schools detect plagiarism, he can't guarantee full detection.

ChatGPT can also be used as a resource and tool for software development. It can be used to write, debug, refactor and explain code.

How to use ChatGPT to write code
From generating boilerplate code to debugging or explaining existing code, ChatGPT is a no-brainer way to be a faster, more efficient software engineer.
www.pluralsight.com
The pros
- It’s quick and easy to generate code using ChatGPT. If you were going to StackOverflow or Google to find a code snippet anyway, why not shorten the amount of time it takes to find it?
- ChatGPT generally gets syntax right, potentially saving you time chasing compile-time or runtime errors that we humans can introduce if we’re coding from scratch.
- By using ChatGPT for coding, you might get introduced to some alternative ways of doing things that you wouldn’t have thought about otherwise.
- ChatGPT makes it faster and easier to learn new languages and concepts, with explanations and code all in the same interface.
The cons
- While artificial intelligence has come a long way, and ChatGPT is quite frankly mind-blowing, it’s not 100% correct all the time. It makes mistakes just like us. And sometimes it’s really confident when it makes those mistakes. So you still need to check it, test it and debug it, just like you’ve done in the past.
- ChatGPT lacks the overall context for what you’re building and why. Sure, it can give you a code snippet, or even an entire code file. But it won’t understand conventions or best practices for your company or project. It doesn’t know how that code will interact with other code. It doesn’t understand requirements around performance, security, privacy, accessibility and so on. Again, as a human, you’re ultimately responsible for the code that ships.
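The "check it, test it and debug it" advice above can be made concrete. The snippet below is a hypothetical illustration (the function and its behavior are my own invention, not from either article): a small utility of the kind an assistant might generate on request, followed by the quick checks a human should run before trusting it.

```python
# A function of the kind an AI assistant might generate when asked:
# "write a function that returns the n largest values in a list".
def top_n(values, n):
    """Return the n largest values, largest first."""
    return sorted(values, reverse=True)[:n]

# Never ship generated code untested: quick assertions catch the
# subtle mistakes such tools can confidently make (wrong sort order,
# off-by-one slicing, accidental mutation of the input).
data = [3, 1, 4, 1, 5, 9, 2, 6]
assert top_n(data, 3) == [9, 6, 5]
assert top_n(data, 0) == []
assert data == [3, 1, 4, 1, 5, 9, 2, 6]  # input list is not mutated
print("all checks passed")
```

Here the code happens to be correct, but the point stands either way: the assertions are what tell you so, not the confidence of the tool that wrote it.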
I found this section particularly interesting:
I think the ever-evolving landscape of threats to cybersecurity is a great example of why AI can never overtake humans.

Is ChatGPT coding trustworthy?
Let’s see what ChatGPT has to say about its own trustworthiness.
[Image: ChatGPT's own response on whether its code can be trusted]
Even ChatGPT is aware of its own limitations! Just as you wouldn’t rely on your phone’s autocorrect functionality to send an important text message, you shouldn’t rely on ChatGPT for writing perfect code.
Take secure coding, for example. Remember that ChatGPT works by “learning” a huge set of existing data. In the case of ChatGPT version 3, it was trained on data through the end of 2021. Because the cybersecurity landscape is constantly changing, and new vulnerabilities appear every day, ChatGPT won’t have the latest intelligence to inform the code it writes.
ChatGPT has no knowledge of your specific requirements for authentication, secrets management, third-party components, vulnerability scanning and so on. It might give you boilerplate code with a placeholder for hard-coded credentials. If you don’t know that those should be stored in an environment variable or in a secrets management service, then this is a huge security risk for your application.
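The hard-coded-credentials pitfall described above looks roughly like this in practice. The snippet is a minimal sketch, not from the article: the environment-variable name and function are my own hypothetical examples, contrasting the risky placeholder pattern with reading the secret from the environment.

```python
import os

# Risky pattern often seen in generated boilerplate: a credential
# placeholder that is easy to fill in and accidentally commit.
# API_KEY = "sk-REPLACE-ME"  # hard-coded secret: do not do this

def get_api_key():
    """Read the API key from the environment, failing loudly if absent."""
    # MY_SERVICE_API_KEY is a hypothetical variable name for illustration.
    key = os.environ.get("MY_SERVICE_API_KEY")
    if key is None:
        raise RuntimeError("MY_SERVICE_API_KEY is not set")
    return key
```

In a real deployment the variable would be injected by the runtime or fetched from a secrets management service, so the credential never lives in source control at all.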
Ultimately, while ChatGPT is good for writing code snippets and simple applications, human developers have the critical job of putting the pieces together and applying best practices.