But everyone on Lemmy said LLMs had no usecases
They have lots of use cases for red team: recon, enumeration, exploit chaining, fuzzing. It doesn't matter if the error rate is 10-20%; a shell is a shell.
I imagine it has plenty of use cases for blue team as well, just not as many for active threat response.
It can help you write the patch, or identify threats in a SIEM or SOAR setup. But I can't think of much else. Defense has to be correct: if your .htaccess file is 99% correct, that's a problem.
Can see a few people disagree with you
Does anyone have a good litmus test for when the perspective might shift? TurboQuant making it easier to have larger context windows for local models gives me a pinch of hope, and I'm really holding out for a decent open-weights model I can self-host for home automation.
I’m fully aware LLMs are just predictive text on roids and we haven’t achieved real AGI, but do we know of anything that will help us filter through the marketing?
You can do it locally now pretty easily, depending on your use case and hardware. Hugging Face has all the models you'd need, and you can use something like llama-swap to serve them.
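For a sense of what that setup looks like, here is a minimal sketch of a llama-swap config. The model name, file path, port, and flags are hypothetical examples, and the exact schema is from memory, so check the llama-swap README before using it:

```yaml
# Sketch of a llama-swap config: each named model maps to a command
# that launches a llama.cpp server, and llama-swap proxies requests
# to whichever model was requested, swapping them in and out.
models:
  "qwen3-14b":
    # Hypothetical model path and context size; adjust for your hardware.
    cmd: llama-server --port 9001 -m /models/qwen3-14b-q4_k_m.gguf -c 8192
    proxy: http://127.0.0.1:9001
```

Clients then talk to llama-swap's single OpenAI-compatible endpoint and pick a model by name, which is convenient for home-automation setups where different tasks want different model sizes.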
There are millions of YouTube videos on this subject.
Qwen3.5 is very capable, and you can run it on whatever hardware you have; it just depends on the model size.
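A rough rule of thumb for matching model size to hardware (my own back-of-envelope approximation, not from the thread): weight memory in GB is roughly parameters (in billions) times bits per weight, divided by 8. Real usage adds KV-cache and runtime overhead that grow with context length, so treat this as a floor:

```python
def model_gb(params_billions: float, bits_per_weight: float) -> float:
    """Approximate weight memory in GB: params * bits-per-weight / 8.

    Ignores KV cache and runtime overhead, which grow with context size.
    """
    return params_billions * bits_per_weight / 8


# e.g. a 14B model at 4-bit quantization needs ~7 GB just for weights:
print(round(model_gb(14, 4), 1))
# while the same model unquantized at 16-bit needs ~28 GB:
print(round(model_gb(14, 16), 1))
```

So a 14B model at 4-bit fits comfortably on a 12 GB GPU with room for context, while the 16-bit version does not.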
You're seeing massive cope because most of Lemmy is tech workers.