LanternEverywhere
- 0 Posts
- 57 Comments
LanternEverywhere@kbin.social to Technology@lemmy.world • Apple Will Revamp Siri to Catch Up to Its Chatbot Competitors [using generative AI] • 11 · 1 year ago
It’s amazing how very far ahead Siri was at the beginning, and how extremely far behind it has fallen. It’s a massive miss for Tim Cook’s Apple.
LanternEverywhere@kbin.social to Piracy: ꜱᴀɪʟ ᴛʜᴇ ʜɪɢʜ ꜱᴇᴀꜱ@lemmy.dbzer0.com • Any way to rip high-quality from Apple Music? Or Spotify? • 912 · 1 year ago
Just torrent the music you want.
LanternEverywhere@kbin.social to Technology@lemmy.world • UK Trial: Pornhub's Chatbot Halts Millions from Accessing Child Abuse Content • 1 · 1 year ago
Can you very loosely tell me what that is so I don’t have to google it?
LanternEverywhere@kbin.social to Selfhosted@lemmy.world • What storage manufacturer do you swear to and why? • 2 · 1 year ago
Yup, knock on wood, I’ve had lots of Seagate drives over the decades and I’ve never had any of them go bad. I’ve had two WD drives and they both failed.
LanternEverywhere@kbin.social to Technology@beehaw.org • Hallucination is Inevitable: An Innate Limitation of Large Language Models (arxiv preprint) • 1 · 1 year ago
I just asked ChatGPT-4 that exact question, copied and pasted, and here is its response:
There was no president of the United States in 3500 BC. The United States of America did not exist as a country until 1776, when the Declaration of Independence was signed. ¹ The first president of the United States was George Washington, who took office in 1789. ² Before that, the territory that is now the United States was inhabited by various Native American tribes and colonized by European powers such as Spain, France, and Great Britain. ³
¹: Declaration of Independence
²: List of presidents of the United States - Wikipedia
³: History of the United States - Wikipedia
Source: Conversation with Bing, 2/27/2024
(1) List of presidents of the United States - Wikipedia. https://en.wikipedia.org/wiki/List_of_Presidents_of_the_United_States.
(2) List of presidents of the United States - Simple English Wikipedia, the … https://simple.wikipedia.org/wiki/List_of_Presidents_of_the_United_States.
(3) President of the United States - Wikipedia. https://en.wikipedia.org/wiki/President_of_the_United_States.
(4) List of presidents of the United States - Wikipedia. https://en.wikipedia.org/wiki/List_of_Presidents_of_the_United_States.
(5) List of presidents of the United States - Wikipedia. https://en.wikipedia.org/wiki/List_of_Presidents_of_the_United_States.
(6) List of U.S. Presidents in Chronological Order - HistoryNet. https://www.historynet.com/us-presidents/.
(7) Getty Images. https://www.gettyimages.com/detail/photo/seal-of-the-president-of-the-united-states-royalty-free-image/1084903580.
LanternEverywhere@kbin.social to Technology@beehaw.org • Hallucination is Inevitable: An Innate Limitation of Large Language Models (arxiv preprint) • 1 · 1 year ago
Are you using 4? Because it’s much better than the earlier versions.
LanternEverywhere@kbin.social to Technology@beehaw.org • Hallucination is Inevitable: An Innate Limitation of Large Language Models (arxiv preprint) • 1 · 1 year ago
That’s a fun story, but it isn’t applicable to the topic here. That could very easily be verified as true or false by a secondary system. In fact you can just ask Wolfram Alpha: ask it what the odds are that any two people share the same birthday. I just asked it that exact question and it replied 1/365.
EDIT
In fact, I just asked that exact same question to ChatGPT-4 and it also replied 1/365.
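For reference, the 1/365 figure is easy to sanity-check without any external service: assuming 365 equally likely birthdays (and ignoring leap days), the first person’s birthday can be anything, and the second person matches it with probability 1/365. A minimal check:

```python
from fractions import Fraction

# Probability that two specific people share a birthday,
# assuming 365 equally likely days and ignoring Feb 29.
DAYS = 365
p_match = Fraction(1, DAYS)  # the second birthday must land on the first

print(p_match)         # 1/365
print(float(p_match))  # about 0.00274
```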
LanternEverywhere@kbin.social to Technology@beehaw.org • Hallucination is Inevitable: An Innate Limitation of Large Language Models (arxiv preprint) • 1 · 1 year ago
There are already multiple LLMs that are essentially completely different from one another. In fact this is one of the major problems with LLMs: even a small change to an LLM turns out to radically alter the output it returns for huge numbers of seemingly unrelated topics.
For your other point, I never said bouncing their answers back and forth for verification was trivial, but it’s definitely doable.
LanternEverywhere@kbin.social to Technology@beehaw.org • Hallucination is Inevitable: An Innate Limitation of Large Language Models (arxiv preprint) • 1 · 1 year ago
That’s not a problem at all. I already use prompts that allow the LLM to say it doesn’t know an answer, and it does take that option when it’s unable to find a correct answer. For instance, I often phrase questions like this: “Is it known whether or not red is a color in the rainbow?” And for questions where it doesn’t know the answer, it now tells you it doesn’t know.
And to your other point, the systems may not be capable of discerning their own hallucinations, but a totally separate LLM will be able to do so pretty easily.
LanternEverywhere@kbin.social to Technology@beehaw.org • Hallucination is Inevitable: An Innate Limitation of Large Language Models (arxiv preprint) • 2 · 1 year ago
Give an example of a statement that you think couldn’t be verified.
LanternEverywhere@kbin.social to Technology@beehaw.org • Hallucination is Inevitable: An Innate Limitation of Large Language Models (arxiv preprint) • 2 · 1 year ago
No, I’ve used LLMs to do exactly this, and it works. You prompt it with a statement and ask “is this true, yes or no?” It will reply with a yes or no, and it’s almost always correct. Doing this verification through multiple different LLMs would eliminate close to 100% of hallucinations.
EDIT
I just tested it multiple times in ChatGPT-4, and it got every true/false answer correct.
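The yes/no loop described above can be sketched in a few lines. Everything here is a stand-in: the stub functions just return canned answers, and a real setup would replace them with calls to actual, independent LLM APIs.

```python
def make_stub(answer: str):
    """Stand-in for a real LLM client; always returns the canned answer."""
    def ask(prompt: str) -> str:
        return answer
    return ask

def verify(statement: str, verifiers) -> bool:
    """Ask each model 'is this true, yes or no?' and take a majority vote."""
    prompt = f"Is the following statement true? Answer yes or no.\n{statement}"
    votes = [ask(prompt).strip().lower().startswith("yes") for ask in verifiers]
    return sum(votes) > len(votes) / 2

# Three hypothetical, independent verifier models.
verifiers = [make_stub("Yes."), make_stub("yes"), make_stub("No.")]
print(verify("Red is a color in the rainbow.", verifiers))  # True (2 of 3 say yes)
```

The majority vote is one simple way to combine the answers; requiring unanimity would be stricter at the cost of more false alarms.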
LanternEverywhere@kbin.social to Technology@beehaw.org • Hallucination is Inevitable: An Innate Limitation of Large Language Models (arxiv preprint) • 3 · 1 year ago
I very much doubt that hallucination is a limitation in final output. It may be an inevitable part of the process, but it’s almost certainly a surmountable problem.
Just off the top of my head I can imagine using two separate LLMs for a final output, the first one generates an initial output, and the second one verifies whether what it says is accurate. The chance of two totally independent LLMs having the same hallucination is probably very low. And you can add as many additional separate LLMs for re-verification as you like. The chance of a hallucination making it through multiple LLM verifications probably gets close to zero.
While this would greatly multiply the resources required, it’s just a simple example showing that hallucinations are not inevitable in the final output.
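The “close to zero” intuition can be made concrete: if each verifier independently fails to catch a given hallucination with probability p, the chance of it surviving k verifiers is p^k. The p = 0.1 below is an illustrative assumption, not a measured number, and real LLM errors may be correlated rather than fully independent:

```python
def survival_probability(p_miss: float, k: int) -> float:
    """Chance a hallucination slips past k verifiers, each missing it
    with probability p_miss, assuming the misses are independent."""
    return p_miss ** k

for k in range(1, 5):
    print(k, survival_probability(0.1, k))
# Each added verifier cuts the survival chance by another factor of ten.
```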
LanternEverywhere@kbin.social to Technology@lemmy.world • You Don’t Need to Use Airplane Mode on Airplanes • 37 · 1 year ago
On Android you do have that toggle.
I think this must be what’s happening. 5G is equal to or better than 4G in almost all ways. And if your phone is set to switch dynamically between 4G and 5G depending on what’s best at any given moment, then there’s literally no downside, only upsides.
LanternEverywhere@kbin.social to Technology@lemmy.world • Stop putting your wet iPhone in rice, says Apple. Here’s what to do instead • 1 · 1 year ago
But, dry from what? How often are your electronics encountering a meaningful amount of moisture?
LanternEverywhere@kbin.social to Technology@lemmy.world • Stop putting your wet iPhone in rice, says Apple. Here’s what to do instead • 2 · 1 year ago
What do you use them for?
LanternEverywhere@kbin.social to Technology@lemmy.world • Stop putting your wet iPhone in rice, says Apple. Here’s what to do instead • 20 · 1 year ago
But silica packets stop doing anything once they’ve absorbed moisture, so they aren’t reusable once they’ve been exposed to normal air moisture (unless you’ve baked them to reactivate them). Is that not right? Because basically no one has a box full of re-baked silica packets hanging around ready for emergency use.
LanternEverywhere@kbin.social to Piracy: ꜱᴀɪʟ ᴛʜᴇ ʜɪɢʜ ꜱᴇᴀꜱ@lemmy.dbzer0.com • Study finds anti-piracy messages backfire, especially for men • 91 · 1 year ago
Exactly. If there were a Spotify-like service for video where I could get 99.9% of all TV and movies of all time in one place without ads, then I’d be willing to pay like 40 bucks a month, maybe even 50. But since no video service is even remotely close to that, I just pirate instead, which provides exactly that type of service and costs zero dollars a month.
On top of which, these are definitely noisy nuisances, and as a result people are gonna fuck with them, and it’s inevitable that eventually someone gets hurt, which’ll cause a huge backlash, expensive lawsuits, etc.
The only situation where I can see these being even potentially viable is in very rural areas, where delivery routes are expensive, people have lots of open land for packages to safely reach the ground, and there aren’t a lot of nearby neighbors to annoy.