👺 May your socks always be wet.
- 0 Posts
- 12 Comments
watersnipje@lemmy.blahaj.zone to Technology@lemmy.world • Users ditch Glassdoor, stunned by site adding real names without consent (English · 1 · 1 year ago)
What happened that they screamed at you even before an interview?
watersnipje@lemmy.blahaj.zone to Technology@beehaw.org • ‘We definitely messed up’: why did Google AI tool make offensive historical images? (10 · 1 year ago)
They switched off image generation after these issues, so it (correctly) said that it couldn’t generate images at the time.
watersnipje@lemmy.blahaj.zone to Technology@lemmy.world • Nightshade, the free tool that ‘poisons’ AI models, is now available for artists to use (English · 3 · 1 year ago)
Thank you for explaining. I work in NLP and am not familiar with all the CV acronyms. It sounds like it kind of defeats the purpose if it only targets open source models. But yeah, it makes sense that you would need the actual autoencoder in order to learn how to alter your data such that the representation from the autoencoder is different enough.
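To make the idea in that comment concrete, here is a minimal, illustrative sketch (not Nightshade’s actual algorithm) of the kind of perturbation the thread is describing, assuming white-box access to a PyTorch image encoder. It nudges an image so that its latent representation drifts away from the original while the pixel changes stay within a small budget; the `encoder` module, `image` tensor, and parameter values are hypothetical.

```python
import torch

def perturb_image(encoder, image, steps=50, lr=1e-2, eps=8 / 255):
    """Push encoder(image + delta) away from encoder(image) while keeping delta small."""
    encoder.eval()
    with torch.no_grad():
        original = encoder(image)                     # latent of the clean image
    delta = torch.zeros_like(image, requires_grad=True)
    optimizer = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        shifted = encoder(image + delta)
        # Negative MSE: minimizing this loss maximizes the representation shift.
        loss = -torch.nn.functional.mse_loss(shifted, original)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        with torch.no_grad():
            delta.clamp_(-eps, eps)                   # keep the change visually small
    return (image + delta).detach()
```

Because the perturbation is optimized against this specific encoder’s gradients, it transfers best to models that use the same or a similar encoder, which is the point the comment makes about only affecting models whose autoencoder you can get hold of.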
watersnipje@lemmy.blahaj.zone to Technology@beehaw.org • None of these people exist, but you can buy their books on Amazon anyway (3 · 2 years ago)
Fair enough. They only have to convince the self-help books crowd 🙃
watersnipje@lemmy.blahaj.zone to Technology@beehaw.org • None of these people exist, but you can buy their books on Amazon anyway (4 · 2 years ago)
Yeah, if you already have it then it’s not really an extra cost. But the smaller models perform less well and less reliably.
In order to write a book that’s convincing enough to fool at least some buyers, I wouldn’t expect a Llama2 7B to do the trick, based on what I see in my work (ML engineer). But even at work, I run Llama2 70B quantized at most, not the full-size one. Full-size unquantized requires about 320 GB of GPU VRAM (rough numbers are sketched below), and that’s just quite expensive (even more so when you have to rent it from cloud providers).
Although if you already have a GPU that size at home, then of course you can run any LLM you like :)
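For a rough sense of the numbers in the comment above, here is a back-of-the-envelope sketch assuming memory is dominated by the weights alone (real usage adds activation and KV-cache overhead, so actual requirements run higher):

```python
def weight_memory_gb(params_billion: float, bits_per_param: int) -> float:
    """Rough GPU memory needed just to hold the weights, in GB."""
    return params_billion * 1e9 * bits_per_param / 8 / 1e9

for bits, label in [(32, "fp32 (full precision)"), (16, "fp16/bf16"), (4, "4-bit quantized")]:
    print(f"Llama2 70B @ {label}: ~{weight_memory_gb(70, bits):.0f} GB")
# Llama2 70B @ fp32 (full precision): ~280 GB
# Llama2 70B @ fp16/bf16: ~140 GB
# Llama2 70B @ 4-bit quantized: ~35 GB
```

At full precision the weights alone approach the 280–320 GB range once overhead is counted, while a 4-bit quantized copy is an order of magnitude smaller, which is the trade-off the comment is describing.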
watersnipje@lemmy.blahaj.zone to Technology@beehaw.org • None of these people exist, but you can buy their books on Amazon anyway (5 · 2 years ago)
For free? The larger models require a lot of hardware.
watersnipje@lemmy.blahaj.zone to Technology@lemmy.world • New Study: At Least 15% of All Reddit Content is Corporate Trolls Trying to Manipulate Public Opinion (English · 6 · 2 years ago)
What’s Sam Altman’s connection to Reddit?
watersnipje@lemmy.blahaj.zone to Technology@lemmy.world • Finding a Tech Job Is Still a Nightmare (English · 1 · 2 years ago)
The same resumes as each other? Did they just copy an example resume they found online?
watersnipje@lemmy.blahaj.zone to Technology@lemmy.world • [Survey] Can you tell which images are AI generated? (English · 12 · 2 years ago)
13/20, I work in AI. The paintings were the hardest for me, because the art style obfuscates some of the AI artefacts that can be tells.
They just finished with their one-child policy and now they’re going in the other direction? Seems like policing people’s fertility didn’t go that well the first time.