text | label | dataType | communityName | datetime |
---|---|---|---|---|
If Recall doesn’t help cause the AI fad to slow, what will? | r/technology | comment | r/technology | 2024-05-06 |
What’s the point of Recall being accessible in this state if the hardware and software hasn’t even been released yet? I’m surprised Microsoft doesn’t have that more locked down | r/technology | comment | r/technology | 2024-05-06 |
At least they’re getting more power efficient. You’ll only need 2 dedicated nuclear reactors instead of the 3 you need for the last gen. | r/technology | comment | r/technology | 2024-05-06 |
After 30 years of using Windows, my next computer will be an Apple. | r/technology | comment | r/technology | 2024-05-06 |
Thanks for sharing our story. For our new readers, here's a little snippet from the piece:
The [Windows Recall system](https://www.wired.com/story/microsoft-recall-alternatives/) takes screenshots of your activity every five seconds and saves them on the device. But security experts say that data may not stay there for long.
Two weeks ahead of [Recall’s launch on new Copilot+ PCs on June 18](https://www.wired.com/story/everything-announced-microsoft-surface-event-2024/), security researchers have demonstrated how preview versions of the tool store the screenshots in an unencrypted database. The researchers say the data could easily be hoovered up by an attacker. And now, in a warning about how Recall could be abused by criminal hackers, Alex Hagenah, a cybersecurity strategist and ethical hacker, has released a demo tool that can automatically extract and display everything Recall records on a laptop.
Read the full story: [https://www.wired.com/story/total-recall-windows-recall-ai/](https://www.wired.com/story/total-recall-windows-recall-ai/) | r/technology | comment | r/technology | 2024-05-06 |
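The unencrypted-database problem the researchers describe can be shown with a toy sketch: any process running as the same user can open a plaintext SQLite file directly, no key required. The path, table, and column names below are invented for illustration and are not Recall's actual schema.

```python
import os
import sqlite3
import tempfile

# Build a stand-in for the kind of unencrypted SQLite store the
# researchers describe. Schema names here are invented for the demo.
db_path = os.path.join(tempfile.mkdtemp(), "screenshots.db")

con = sqlite3.connect(db_path)
con.execute("CREATE TABLE captures (ts TEXT, window_title TEXT, ocr_text TEXT)")
con.execute(
    "INSERT INTO captures VALUES (?, ?, ?)",
    ("2024-06-01T12:00:00", "Online Banking", "account number 12345678"),
)
con.commit()
con.close()

# Any other process running as the same user (malware included) can
# simply open the file -- no password, no decryption, just a path.
attacker = sqlite3.connect(db_path)
rows = attacker.execute("SELECT window_title, ocr_text FROM captures").fetchall()
print(rows)  # [('Online Banking', 'account number 12345678')]
attacker.close()
```

This is the entire attack surface Hagenah's demo tool exploits in the preview builds: file-system access as the user is sufficient.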
Which sucks, I absolutely hate macOS. So fn clunky | r/technology | comment | r/technology | 2024-05-06 |
We use PowerPlatform at work and there are bugs that have existed since release in 2018. | r/technology | comment | r/technology | 2024-05-06 |
I don't think that's how insurance works at all. There's tons of Windows features that would be a much larger risk than this, like using insecure protocols, but you can still get insurance. Hell, there's no OS that doesn't include absurdly risky features. | r/technology | comment | r/technology | 2024-05-06 |
Having to dick around with stuff like that for hours isn't a learning issue, that's a weakness of the OS. I recently had a similar issue when I was trying to get an Xbox controller working on Linux. I manage Linux servers for work, but my Linux desktop experience just isn't very good. I'd rather spend a minute or 2 disabling stuff on Windows when necessary than have to deal with Linux desktop issues forever. The one thing that would make me switch is if Windows accounts became mandatory. | r/technology | comment | r/technology | 2024-05-06 |
It would just be a line item on an audit. There's tons of stuff that companies need to manage to adhere to various different regulations. | r/technology | comment | r/technology | 2024-05-06 |
When AI starts flashing itself as an OS ima leave windows so fast.... | r/technology | comment | r/technology | 2024-05-06 |
>The one thing that would make me switch is if Windows accounts became mandatory.
So the spyware that reportedly can use up to 16GB of memory for no benefit to you and is opting people in by default with an update isn't a big deal to you? Because that was the deal-breaker for me.
Agreed that those issues are an OS thing, but I don't think Windows is free of them either; I think we're just much more used to Windows' problems. Like, I bought a Victrix BFG. They decided to only release their software via a Windows Store app, a feature I disabled early on because the Windows app store sucks. So to get that controller's firmware updated and its buttons mappable, I had to spend a few hours rolling back all the shit I had disabled. All because we couldn't just have an executable. Sure, you could say that's on the software provider, but the same could be said for Logitech not supporting Linux.
End of the day, I agree the convenience of Windows makes the switch harder for most people, and Linux is far from perfect. But I also think we live in a time where I wouldn't be surprised if, sometime in the not-so-distant future, a fascist with an understanding of tech makes their way into power, and all that data they've been collecting for advertising purposes might just end up being used against "undesirables". Shit, we're already seeing politicians trying to find ways to get their hands on abortion data. | r/technology | comment | r/technology | 2024-05-06 |
Yeah, I struggle to think of a single reason someone would really want this, but the scenarios you're listing are the real biggest threats. From a pure security standpoint this expands the attack surface a bit, but probably isn't a huge deal, but the blackmail/manipulation potential is huge. | r/technology | comment | r/technology | 2024-05-06 |
What a stupid comment.
I've dual-booted Linux distro(s) and Windows for quite some time, even with Linux distros as my main OS.
It's not for everyone. I had so many distros where LibreOffice would not work. Literally just a fresh install, install LibreOffice via repo/GUI store/CLI, whatever you want, open it --> loads for a few sec and crashes.
That's before we delve into stuff like certain hiccups on certain distros that require extensive analysis. Had a 6TB drive that would randomly freeze Debian.
Lots of applications don't work or exist on Linux distros, and recommending alternatives is not what people are looking for, period.
No I don't want the pile of junk that is CodeBlocks or paid CLion! I want "Visual Studio" (no "VSCode" is NOT an IDE, stop recommending it when people explicitly ask for VS!).
Blender crashes for me on Arch, the moment things get a bit too nice.
Plenty of games straight up don't work or need extreme luck with Wine aka a translation layer.
Have fun spending hours getting GTAV working on Linux, only to end up with extremely blurry textures when R* releases a damn hotfix... (yes, first-hand exp.)
Modding and lots of other things work with Wine, but there are some complexities or stuff you need to get used to, obviously.
Then there's XZ 5.6.1, lurking for a month without anyone noticing, because after all, open source means everyone audits the code, which means we are safe...... Libraries like glibc direly need maintainers; there was a request for one a year ago.... glibc, a very integral lib.... in need of maintainers.
## Or you could dualboot Windows 10
which will quickly make you boot into your Linux distro less over time, depending on which OS has more of your day-to-day work. | r/technology | comment | r/technology | 2024-05-06 |
The outlined experiences occurred on many distros with different setups, across different times, on different machines, with and without dualboot, with a user that has almost a decade of exp. with Linux distros.
Don't bother replying with "meh user-fault".... | r/technology | comment | r/technology | 2024-05-06 |
Congrats. You found out why they want every machine sold now to have a TPM 2 module | r/technology | comment | r/technology | 2024-05-06 |
How is a local feature that can be disabled a GDPR nuke? | r/technology | comment | r/technology | 2024-05-06 |
It is local only and stores up to a specified point before throwing out old data. | r/technology | comment | r/technology | 2024-05-06 |
Yeah, no. I'm all for encryption where it matters or is important. The average person is more likely to have some of their hardware fail on them than to ever need BitLocker, though.
Businesses should have it by default, governments as well. But not the average consumer, who is more worried about whether they can get their old stuff back after they drop their laptop and damage something inside. | r/technology | comment | r/technology | 2024-05-06 |
until those processors become standard and Intel/AMD start using them exclusively. So the next time you upgrade, BOOM, Recall says go | r/technology | comment | r/technology | 2024-05-06 |
until a Windows update randomly turns it back on without you knowing... | r/technology | comment | r/technology | 2024-05-06 |
This was not a serious feature. It’s just to move the Overton window to include surveillance for enterprises that don’t trust work-from-home employees. They’ll pull it out but leave the hooks, and a whole new cottage industry of plugins will propagate. | r/technology | comment | r/technology | 2024-05-06 |
Sounds like they need to be broken up then | r/technology | comment | r/technology | 2024-05-06 |
I use a Mac for work and I still do half of the normal OS stuff through terminal. Finder is such an ass file explorer. | r/technology | comment | r/technology | 2024-05-06 |
I mainly use my pc for videogames, so that’s not gonna work | r/technology | comment | r/technology | 2024-05-06 |
After 3 years, a solemn ceremony and dedicated festivities should inaugurate them as features. | r/technology | comment | r/technology | 2024-05-06 |
Didn't read the article; is one of them the controller batteries dying after like a few uses? My HTC Vive wands I'd charge maybe once every few months because I don't play often. The Quest 3's controllers are constantly dying because they keep turning on or staying on; you have to make sure the controllers don't move or press in the grip button, or else you are screwed the next time you go to use them. | r/technology | comment | r/technology | 2024-05-06 |
| r/technology | post | r/technology | 2024-05-06 |
| r/technology | post | r/technology | 2024-05-06 |
https://edtrust.org/the-equity-line/the-literacy-crisis-in-the-u-s-is-deeply-concerning-and-totally-preventable/
The Literacy Crisis in the U.S. is Deeply Concerning—and Totally ... | r/technology | comment | r/technology | 2024-05-06 |
Reddit is also brainrot, just a different kind. Pick your poison | r/technology | comment | r/technology | 2024-05-06 |
| r/technology | post | r/technology | 2024-05-06 |
| r/technology | post | r/technology | 2024-05-06 |
Nvidia reported eye-popping revenue last week. Elon Musk just said human-level artificial intelligence is coming next year. Big tech can’t seem to buy enough AI-powering chips. It sure seems like the AI hype train is just leaving the station, and we should all hop aboard.
But significant disappointment may be on the horizon, both in terms of what AI can do, and the returns it will generate for investors.
The rate of improvement for AIs is slowing, and there appear to be fewer applications than originally imagined for even the most capable of them. It is wildly expensive to build and run AI. New, competing AI models are popping up constantly, but it takes a long time for them to have a meaningful impact on how most people actually work.
These factors raise questions about whether AI could become commoditized, about its potential to produce revenue and especially profits, and whether a new economy is actually being born. They also suggest that spending on AI is probably getting ahead of itself in a way we last saw during the fiber-optic boom of the late 1990s—a boom that led to some of the biggest crashes of the first dot-com bubble.
The pace of improvement in AIs is slowing
Most of the measurable and qualitative improvements in today’s large language model AIs like OpenAI’s ChatGPT and Google’s Gemini—including their talents for writing and analysis—come down to shoving ever more data into them.
These models work by digesting huge volumes of text, and it’s undeniable that up to now, simply adding more has led to better capabilities. But a major barrier to continuing down this path is that companies have already trained their AIs on more or less the entire internet, and are running out of additional data to hoover up. There aren’t 10 more internets’ worth of human-generated content for today’s AIs to inhale.
To train next generation AIs, engineers are turning to “synthetic data,” which is data generated by other AIs. That approach didn’t work to create better self-driving technology for vehicles, and there is plenty of evidence it will be no better for large language models, says Gary Marcus, a cognitive scientist who sold an AI startup to Uber in 2016.
AIs like ChatGPT rapidly got better in their early days, but what we’ve seen in the past 14-and-a-half months are only incremental gains, says Marcus. “The truth is, the core capabilities of these systems have either reached a plateau, or at least have slowed down in their improvement,” he adds.
Further evidence of the slowdown in improvement of AIs can be found in research showing that the gaps between the performance of various AI models are closing. All of the best proprietary AI models are converging on about the same scores on tests of their abilities, and even free, open-source models, like those from Meta and Mistral, are catching up. | r/technology | comment | r/technology | 2024-05-06 |
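The diminishing-returns argument above can be sketched with a scaling-law-shaped curve. The power-law form loss(N) = a·N^(-alpha) is the general shape reported in the scaling-law literature; the constants below are invented purely for illustration, not fitted to any real model.

```python
# Illustrative only: a power-law "scaling curve" with made-up constants,
# showing why each doubling of training data buys less improvement.
a, alpha = 10.0, 0.1  # invented demo constants

def loss(n_tokens: float) -> float:
    return a * n_tokens ** -alpha

gains = []
n = 1e9
for _ in range(5):
    gains.append(loss(n) - loss(2 * n))  # improvement from doubling the data
    n *= 2

# Each successive doubling yields a smaller absolute gain, which is the
# mechanism behind "running out of internet" mattering so much.
print([round(g, 4) for g in gains])
```

Under this shape, exhausting human-generated text means the cheap lever (more data) stops moving the curve much, which is consistent with the convergence between models the paragraph above describes.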
AI could become a commodity
A mature technology is one where everyone knows how to build it. Absent profound breakthroughs—which become exceedingly rare—no one has an edge in performance. At the same time, companies look for efficiencies, and whoever is winning shifts from who is in the lead to who can cut costs to the bone. The last major technology this happened with was electric vehicles, and now it appears to be happening to AI.
The commoditization of AI is one reason that Anshu Sharma, chief executive of data and AI-privacy startup Skyflow, and a former vice president at business-software giant Salesforce, thinks that the future for AI startups—like OpenAI and Anthropic—could be dim. While he’s optimistic that big companies like Microsoft and Google will be able to entice enough users to make their AI investments worthwhile, doing so will require spending vast amounts of money over a long period of time, leaving even the best-funded AI startups—with their comparatively paltry war chests—unable to compete.
This is happening already. Some AI startups have already run into turmoil, including Inflection AI—its co-founder and other employees decamped for Microsoft in March. The CEO of Stability AI, which built the popular image-generation AI tool Stable Diffusion, left abruptly in March. Many other AI startups, even well-funded ones, are apparently in talks to sell themselves.
Today’s AIs remain ruinously expensive to run
An oft-cited figure in arguments that we’re in an AI bubble is a calculation by Silicon Valley venture-capital firm Sequoia that the industry spent $50 billion on chips from Nvidia to train AI in 2023, but brought in only $3 billion in revenue.
That difference is alarming, but what really matters to the long-term health of the industry is how much it costs to run AIs.
Numbers are almost impossible to come by, and estimates vary widely, but the bottom line is that for a popular service that relies on generative AI, the costs of running it far exceed the already eye-watering cost of training it. That’s because AI has to think anew every single time something is asked of it, and the resources that AI uses when it generates an answer are far larger than what it takes to, say, return a conventional search result. For an almost entirely ad-supported company like Google, which is now offering AI-generated summaries across billions of search results, analysts believe delivering AI answers on those searches will eat into the company’s margins.
In their most recent earnings reports, Google, Microsoft and others said their revenue from cloud services went up, which they attributed in part to those services powering other companies’ AIs. But sustaining that revenue depends on other companies and startups getting enough value out of AI to justify continuing to fork over billions of dollars to train and run those systems. That brings us to the question of adoption. | r/technology | comment | r/technology | 2024-05-06 |
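The margin concern in the passage above reduces to simple per-query arithmetic. All figures below are invented placeholders (not measured costs for any company); only the structure of the calculation is the point.

```python
# Back-of-envelope sketch of the serving-cost margin problem.
# ALL numbers are hypothetical placeholders for illustration.
cost_per_search = 0.0002       # assumed cost of a conventional lookup, USD
cost_per_ai_answer = 0.0036    # assumed GPU inference cost per query, USD
ad_revenue_per_search = 0.005  # assumed ad revenue per query, USD

margin_plain = ad_revenue_per_search - cost_per_search
margin_ai = ad_revenue_per_search - cost_per_ai_answer

print(f"plain search margin/query: ${margin_plain:.4f}")
print(f"AI answer margin/query:    ${margin_ai:.4f}")
# With identical revenue per query, a serving cost that is an order of
# magnitude higher consumes most of the margin -- the analysts' worry.
```

The key structural fact is that inference cost recurs on every query, unlike training cost, so it scales with adoption rather than being amortized by it.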
Narrow use cases, slow adoption
A recent survey conducted by Microsoft and LinkedIn found that three in four white-collar workers now use AI at work. Another survey, from corporate expense-management and tracking company Ramp, shows about a third of companies pay for at least one AI tool, up from 21% a year ago.
This suggests there is a massive gulf between the number of workers who are just playing with AI, and the subset who rely on it and pay for it. Microsoft’s AI Copilot, for example, costs $30 a month.
OpenAI doesn’t disclose its annual revenue, but the Financial Times reported in December that it was at least $2 billion, and that the company thought it could double that amount by 2025.
That is still a far cry from the revenue needed to justify OpenAI’s now nearly $90 billion valuation. The company’s recent demo of its voice-powered features led to a 22% one-day jump in mobile subscriptions, according to analytics firm Appfigures. This shows the company excels at generating interest and attention, but it’s unclear how many of those users will stick around.
Evidence suggests AI isn’t nearly the productivity booster it has been touted as, says Peter Cappelli, a professor of management at the University of Pennsylvania’s Wharton School. While these systems can help some people do their jobs, they can’t actually replace them. This means they are unlikely to help companies save on payroll. He compares it to the way that self-driving trucks have been slow to arrive, in part because it turns out that driving a truck is just one part of a truck driver’s job.
Add in the myriad challenges of using AI at work. For example, AIs still make up fake information, which means they require someone knowledgeable to use them. Also, getting the most out of open-ended chatbots isn’t intuitive, and workers will need significant training and time to adjust.
Changing people’s mindsets and habits will be among the biggest barriers to swift adoption of AI. That is a remarkably consistent pattern across the rollout of all new technologies.
None of this is to say that today’s AI won’t, in the long run, transform all sorts of jobs and industries. The problem is that the current level of investment—in startups and by big companies—seems to be predicated on the idea that AI is going to get so much better, so fast, and be adopted so quickly that its impact on our lives and the economy is hard to comprehend.
Mounting evidence suggests that won’t be the case. | r/technology | comment | r/technology | 2024-05-06 |
That will age like milk. The field is in its infancy; the first step will be to make them more efficient, and then to bundle them together into LLM networks. | r/technology | comment | r/technology | 2024-05-06 |
im a downvote farmer 🧑🌾 | r/technology | comment | r/technology | 2024-05-06 |
You still have to map the intent to the action; an LLM is not a panacea, and it's not some hand-wavey operation to just apply an LLM to Siri | r/technology | comment | r/technology | 2024-05-06 |
"My stupid google assistant can't even handle, "clear the timer and set another one for 5 minutes." "
No shit cause you only have to ask "set timer for 5 minutes" a second time for it to set the new timer, clearly a user error. | r/technology | comment | r/technology | 2024-05-06 |
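The "map intent to action" point in this thread can be sketched with a toy router: a single-intent classifier silently drops half of a compound command, and even the naive fix (splitting on conjunctions) shows how the mapping problem compounds. Patterns and intent names below are invented for illustration.

```python
import re

# Invented toy intent patterns for the timer example in the thread.
PATTERNS = {
    "clear_timer": re.compile(r"clear the timer"),
    "set_timer": re.compile(r"set (?:another one|a timer|timer) for (\d+) minutes?"),
}

def classify_single(utterance: str) -> str:
    # Many assistants map one utterance to exactly one intent.
    for intent, pat in PATTERNS.items():
        if pat.search(utterance):
            return intent
    return "unknown"

def classify_compound(utterance: str):
    # Naive fix: split on conjunctions, classify each clause separately.
    return [classify_single(clause) for clause in re.split(r"\s+and\s+", utterance)]

cmd = "clear the timer and set another one for 5 minutes"
print(classify_single(cmd))    # clear_timer -- the second action is silently dropped
print(classify_compound(cmd))  # ['clear_timer', 'set_timer']
```

Even with an LLM doing the parsing, something downstream still has to turn each recognized intent into a concrete device action, which is the commenter's point.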
Who can argue with insightful analysis such as this? | r/technology | comment | r/technology | 2024-05-06 |
Well, anyone not already rich get rich? No? Well, I guess we failed. Uhhhh… Can we do that GameStop thing again? | r/technology | comment | r/technology | 2024-05-06 |
Yeah no. It's still going like a raging bull.
WSJ is out of touch. | r/technology | comment | r/technology | 2024-05-06 |
OpenAI is expected to have their biggest improvement, possibly GPT-5, out in November. I would say wait until then to decide if the pace of improvement is slowing down. | r/technology | comment | r/technology | 2024-05-06 |
lol.
This is just like the guy that said "oh the internet? That'll never take off!" | r/technology | comment | r/technology | 2024-05-06 |
I have no doubt that there are a ton of people out there who went "chatgpt cool" and then couldn't think of anything to do with it after playing with it for a while. Fine with me. More for us! | r/technology | comment | r/technology | 2024-05-06 |
Depending on who you listen to, either they're trying for a smoother process of transition over time, or they're failing to produce massive transitional leaps. | r/technology | comment | r/technology | 2024-05-06 |
Because no one can properly explain what AGI is. | r/technology | comment | r/technology | 2024-05-06 |
3D printing is huge; it just requires more technical skill than the average pleb has, and AI is the same. It can do some crazy things, but the average person doesn't know how to leverage it in a practical manner. | r/technology | comment | r/technology | 2024-05-06 |
AI is already better than the average person at determining whether a source is credible. It certainly isn't going to get worse. It's obviously not enough to replace experts yet, but multimodal checks are just starting to come out.
People said math would always be an issue and then GPT-4o pretty much solved it. You have no idea what LLMs will "never" be able to do, and there's no reason to assume it will always be an LLM. | r/technology | comment | r/technology | 2024-05-06 |
It’s 3d tv all over again | r/technology | comment | r/technology | 2024-05-06 |
I just bought a microtonal MIDI keyboard whose body and keys were 3D printed. I think that small time manufacturing has found uses for them. I’ve also seen some interesting jewelry that was 3D printed. | r/technology | comment | r/technology | 2024-05-06 |
There is a huge amount of difference between writing code for a computer to interpret and writing a financial report, memo, MOU, proposal, project case study, etc.
Each of these is a different situation that requires different levels of manual edits. Yes, manual edits, as I stated. The 20% is a manual process. If you write a 30-page memo quarterly with pre-set categories of information, then yes, this makes that effort easier.
Writing an email - except maybe marketing emails - is not the use case I was referring to. How often do you have to collect, collate and format information in an email? It's not often. For those that need that level of aggregation from multiple sources, AI is probably useful. But most emails don't fit the profile of work I defined previously. Neither does writing code.
We get it, it doesn't help you in your job. There is in fact more to the world than your perspective. | r/technology | comment | r/technology | 2024-05-06 |
Mixed reality is very very big and it's going to get bigger.
There have been massive advances in computer vision and hardware. We aren't there yet but a huge number of investments are happening right now from the big players. | r/technology | comment | r/technology | 2024-05-06 |
I remember seeing executives get sold on Blockchain and VR and having everyone run around looking for potential applications to avoid being left out. It's currently the same for generative AI, with the exception that some of the applications actually do have value.
I don't expect generative AI to become completely irrelevant like Blockchain and VR have. | r/technology | comment | r/technology | 2024-05-06 |
I stopped when he mentioned Elon Musk in the second sentence. | r/technology | comment | r/technology | 2024-05-06 |
I still think it is worth a read. For a laugh.
Just to remind everyone: Musk promised that full self-drive was a year away... back in 2016. So 7 years ago if we are being generous.
Musk is not a technologist. He doesn't know anything about AI other than hyping it up is in his best financial interest. Were Musk serious, he wouldn't be late to the game with Grok and other products which barely perform even basic tasks.
I can assure everyone, as a software engineer with nearly two decades of experience--having used AI products at the enterprise level and directed my teams to do so over the last couple years--it isn't what they market it as. It's great for coding... if you don't know how to code, or need a basic concept explained to you. The reason everyone buys the hype is because influencers who make money by influencing, not working, sell it. The media sells it, because it gets eyeballs on pages and ears on podcasts.
Remarkable tools. Like IntelliSense. It has already changed the way work is done. But AGI it is not. Complete automation it's not. There's a reason that OpenAI and Nvidia only demo their shit in the most controlled of environments, with extremely scripted and vetted interactions. It is performative, largely.
Also, I bet the reason that 75% of white collar workers use AI on a daily basis (per the article) is that it includes all the people who have it forced upon them by Microsoft across their suite. Microsoft has injected Copilot into practically every major service that a typical white collar worker uses, and some, like the Power Platform, where you absolutely pull your hair out trying to disable it (you can't, not really). So yeah, people are "using it" because this stuff is likely built in and not able to be turned off, but the metric still counts. | r/technology | comment | r/technology | 2024-05-06 |
>newspapers just became websites
What exactly do you think they meant by online databases replacing newspapers? Do you even know what a database is? Every news site is just articles served from a database.
Virtually everybody has learned things on the internet that they would've had to learn from a teacher before the internet and the majority of people in non-democratic countries have learned things that they wouldn't have been able to learn without the internet.
You're also ignoring everything the article got blatantly wrong. The abstract paragraph alone mentions like 8 things that "visionaries" expect to be online in the future, the article calls BS, and literally all of them are regularly hosted online now. Every other paragraph got multiple things wrong as well. | r/technology | comment | r/technology | 2024-05-06 |
Well, "statistics" isn't really as sexy as "machine learning" or "artificial intelligence," so this is what we get. | r/technology | comment | r/technology | 2024-05-06 |
Yeah, the whole "Everyone needs a 3D printer, you'll never have to order another custom part for your vacuum cleaner again!" crowd really underestimates the average person's ability to use 3D modeling software.
I would say I have an above-average understanding of most things. I can use Photoshop, GIMP, etc., and even just finding some tutorials or instructions on how to slightly modify a file from Thingiverse took a huge amount of time and effort for what I would consider something non-trivial.
It was very counterintuitive to just make a slightly larger hole and carve out a small section. Downloading already-made STL files is easy, but modeling your own stuff or even just editing them is a whole other specialty. | r/technology | comment | r/technology | 2024-05-06 |
If you saw somebody asking this question on a 20 year old forum you would laugh at them. Think about why that's the case and you'll have your answer. Computation doesn't scale 1:1 with power and power isn't finite. | r/technology | comment | r/technology | 2024-05-06 |
I don’t doubt that AI for protein folding can happen and will be important…but that’s *extremely* niche.
I think when people think of applications for AI, a huge part of that is based around assumptions and fears of it replacing entire industries or jobs. So far, we’ve mostly just seen low level content writers and illustrators lose work. It hasn’t quite been so transformative, for better or worse. When hysterical headlines hit about employment doom caused by AI, people who expressed caution about that panic seem to be proven right: AI isn’t really coming for our jobs. It’s a tool. Some tools end up causing efficiency and attrition, some employers go to an extreme to use a tool and fire workers, but for the most part, it’s going to be more like a computer-based spreadsheet or word processor that replaced analog versions, rather than something that completely takes over our world. | r/technology | comment | r/technology | 2024-05-06 |
I do the same thing. Sometimes I don't get results I like and I have to go do it myself, and other times I get results that I do like.
I was once mocking up an image of a sign on a wall in an office for something I was doing at work. I found the perfect image to use as the base, but it wasn't wide enough. I used generative AI to widen it and it did a great job.
Another time I found this great watercolor texture and I needed it to be a seamless pattern. The AI actually did a great job and made me a beautiful seamless pattern. Also worked nicely when I had another such texture and just needed more of it. Like to expand it.
One last big one that I remember is that after my mom passed away, we needed a decent photo of her, and I had one of her at my wedding where she was clinging to my shoulder in a group picture. I cropped the image, selected around myself, and told the AI to remove me, and it did a phenomenal job.
I've had many failures with it, but I don't see it completely as a bad thing if it's going to be a handy tool to speed up the process. | r/technology | comment | r/technology | 2024-05-06 |
Well, it's not just the fault of the "everyone can repair stuff" people; have you ever tried repairing any modern equipment? Shit's made to break and be binned nowadays. | r/technology | comment | r/technology | 2024-05-06 |
Eh, my point is mostly that current “AI” doesn’t understand what it’s doing. It’s just doing it. And that greater context is a key differentiating factor. | r/technology | comment | r/technology | 2024-05-06 |
My guy, ChatGPT doesn't know what decimal places are. I asked it to generate a random decimal and tell me what digit was in the ten thousandth place, and it told me the digit in the millionth place. I spent 30 minutes trying to get it to work properly but it simply cannot in any way understand what it's saying. **It only imitates the form of an answer to your question.**
It often gets statistics questions I ask it wrong as well—it'll fail to recognize when a paired samples test is needed, for example, and apply the wrong statistics test or fail to adjust the population to account for standard error. This is stuff a basic class in inferential statistics (like, high school stats shit) would drill into your head. But "AI" cannot understand why a paired samples test would be needed without explicitly being told so. | r/technology | comment | r/technology | 2024-05-06 |
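The paired-versus-unpaired distinction this comment leans on is easy to show numerically with stdlib-only code: pairing works on per-subject differences, removing between-subject variance from the denominator, so the paired t statistic is far larger on the same data. The measurements below are invented toy numbers.

```python
import math
from statistics import mean, stdev

# Invented before/after measurements on the SAME eight subjects.
before = [10.1, 9.8, 10.4, 10.0, 9.9, 10.2, 10.3, 9.7]
after  = [10.4, 10.1, 10.6, 10.3, 10.1, 10.5, 10.6, 10.0]
n = len(before)

# Paired t-test: operate on the per-subject differences.
diffs = [a - b for a, b in zip(after, before)]
t_paired = mean(diffs) / (stdev(diffs) / math.sqrt(n))

# Naive (wrong here) unpaired comparison ignores the pairing and pools
# the much larger between-subject variance into the denominator.
pooled_se = math.sqrt(stdev(before) ** 2 / n + stdev(after) ** 2 / n)
t_unpaired = (mean(after) - mean(before)) / pooled_se

# The paired statistic is several times larger: the pairing is what
# makes the consistent small effect detectable.
print(f"paired t = {t_paired:.2f}, unpaired t = {t_unpaired:.2f}")
```

Choosing between these two tests requires knowing that the samples are linked, which is exactly the contextual judgment the commenter says the model fails to make unless told explicitly.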
No it's not. It's the penny-stock companies slapping AI onto anything remotely reasonable they have on offer that are being washed from the system. All Fortune 500 companies are either developing or integrating AI into their systems and have large dedicated teams on R&D duty for AI opportunities. The sector already blew past 1 trillion and is in no way, shape or form slowing down. The stupid articles you read are literally AI generated or supported. I am stunned people in 2024 still think generative AI is autocorrect but better. | r/technology | comment | r/technology | 2024-05-06 |
This is really what gets me. I'm tired of being told that "I just don't get it". No, I *do* get it... I just don't have bullish ideological or financial biases which motivate my reasoning on the subject.
It's very possible that the technology will grow, but there's so little *real* indication that it'll cause any kind of technological revolution on the scale of the internet. It really *is* a gimmick. Even for its best use cases (AFAIK mostly for sorting through huge amounts of data that would take much longer for humans to do) it still makes errors, and even when it gets to the point that it makes errors on par with a human, it'll only be useful for a limited set of tasks. | r/technology | comment | r/technology | 2024-05-06 |
> There is a huge amount of difference between writing code for a computer to interpret and writing a financial report, memo, MOU, proposal, project case study, etc.
That difference is way smaller than you think. Computer languages are actually quite simple for LLMs, which is why GitHub Copilot was built so early. It's just that LLMs are not reliable, which is a problem that will exist in any real usage.
> If you write a 30 page memo quarterly with pre-set categories of information, then yes this makes that effort easier.
The question then becomes whether that memo was actually valuable to begin with.
Who's gonna read it and why don't they just read the LLM inputs?
All the concrete information in it has to be distilled from work that has been done before. That work would have to be done by a human. The result would also have to be double-checked by a human, with the source data in hand, because an LLM will just hallucinate fictional data in combination with the true data.
Mark my words: this will be tried by companies, and it will make no difference whatsoever to efficiency.
LLMs are fun for generating conversations, which could be cool for generating dialog in a game or something. Their real-world use in the professional sectors will be very limited. | r/technology | comment | r/technology | 2024-05-06 |
"I wanted to charge too much for this ticket...but I can't because NFTs have magical powers that prevent it!"
These are completely unrelated topics. | r/technology | comment | r/technology | 2024-05-06 |
I will believe it when I see Ngreedia's inflated stock coming down.
Meanwhile, we will get many more "A.I." products drilled into our ears. | r/technology | comment | r/technology | 2024-05-06 |
I'm not really sure what you're not sure about. I'm agreeing with you that blockchain tech hasn't been as transformative as originally envisioned.
A lot of people see crypto as a scam, and for good reason: a ton (a majority) of crypto projects are basically scams. But there are a few that are genuinely trying to build better solutions to issues we currently face, from decentralized data storage to giving people a way to transact without a government's oversight. The issue is that most people are not technically skilled enough to interact with crypto safely. Self-custodying your own money is (shocker) very risky. I don't see crypto ever having mass adoption.
That being said, big names in finance seem to have embraced it as a (very) high-risk, high reward asset class. | r/technology | comment | r/technology | 2024-05-06 |
Exactly. I was tempted to reply that the computer revolution is also losing steam. So is the internet one. And the calculator one.
My usual response: https://www.reddit.com/r/ChatGPT/s/dYiHqPIFiF | r/technology | comment | r/technology | 2024-05-06 |
I just used an LLM to write a shell script faster than I could have with just my own knowledge and googling. That script automated connecting to a few hundred servers to perform a basic check that otherwise was going to be done entirely manually, per server.
It's not a very good factual information source but it's often a useful boilerplate text generator when paired with human review and tweaks. | r/technology | comment | r/technology | 2024-05-06 |
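For what it's worth, the fan-out check the comment above describes can be sketched in Python rather than shell (the hostnames and the `df` check are hypothetical, and key-based ssh with no password prompt is assumed):

```python
import subprocess

def build_ssh_command(host, check):
    # BatchMode=yes makes ssh fail fast instead of hanging on a password prompt.
    return ["ssh", "-o", "BatchMode=yes", host, check]

def run_check(host, check, runner=subprocess.run):
    """Run one remote check and return (host, exit_code, stdout)."""
    result = runner(build_ssh_command(host, check),
                    capture_output=True, text=True, timeout=30)
    return host, result.returncode, result.stdout.strip()

def sweep(hosts, check, runner=subprocess.run):
    """Fan the same check out across every host, collecting results."""
    return [run_check(h, check, runner) for h in hosts]
```

A real run would look like `sweep(["app01", "app02"], "df -h /")`; the `runner` parameter is injected so the ssh call can be faked when reviewing or testing the generated script — which is exactly the human-review step the comment recommends. | r/technology | comment | r/technology | 2024-05-06 |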
To me it looks like LLMs are approaching the first peak of the [Gartner hype cycle](https://en.wikipedia.org/wiki/Gartner_hype_cycle). | r/technology | comment | r/technology | 2024-05-06 |
Wow that is so surprising... I can't believe people way overhyped an immature technology that barely does basic arithmetic! So shocked... | r/technology | comment | r/technology | 2024-05-06 |
r/technology | post | r/technology | 2024-05-06 |
r/technology | post | r/technology | 2024-05-06 |
Since the author couldn't find room to include "the 26 words that created the Internet" in the article, I include them below:
>*No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.* | r/technology | comment | r/technology | 2024-05-06 |
My apologies, you don’t seem to understand the core issue - if 230 goes away as OP article describes, regardless of whether or not duty of care immediately exists, places like Reddit would be immediately liable if someone posted a drawing or photograph they don’t have copyright for, or an illegal download link, etc. | r/technology | comment | r/technology | 2024-05-06 |
Companies are already liable for what they publish themselves. Instead, 230 is about whether they are considered the publisher (and therefore liable) for what an end user posts to Reddit, etc. | r/technology | comment | r/technology | 2024-05-06 |
No, I fully understand the ramifications. They just don’t bother me. Social media, or what we used to call Web 2.0, is fun and all but it’s also the cause for most of the disinformation that pollutes our society. No big loss there. | r/technology | comment | r/technology | 2024-05-06 |
Wait… do you mean that people would be held responsible for what they write/say/do on the internet?!? The horror… | r/technology | comment | r/technology | 2024-05-06 |
More specifically, it would mean that Reddit would be liable for what their users post. It would have potential advantages, sure. But it would also mean the end of places like Reddit, if Reddit was legally liable if someone posted a drawing or photo they didn’t own the copyright to, etc.
But regardless, if you don’t think that places like Reddit are good, why are you posting on Reddit? 😂 | r/technology | comment | r/technology | 2024-05-06 |
Reddit and other platforms are great. Let’s not confuse them with the lies, vitriol, and other garbage placed on them by their users. | r/technology | comment | r/technology | 2024-05-06 |
And you don't seem to understand that there are more options than it goes away or it stays as-is. Which is my point. | r/technology | comment | r/technology | 2024-05-06 |
Why does the phone company think it can sell phones to drug dealers? Why does Fedex think they can sell shipping to drug smugglers?
They don’t. Those companies are unaware when this happens, and when they are made aware of specific cases, they must take action. That’s how the Internet works and should continue to work IMO. | r/technology | comment | r/technology | 2024-05-06 |
Those forums would be liable for all user content too. No one would take a risk. | r/technology | comment | r/technology | 2024-05-06 |
Switch to where? Literally every forum would have to manually determine if posts would get them sued.
It would effectively kill any user generated content sites including old school forums. | r/technology | comment | r/technology | 2024-05-06 |
> Switch to where?
To a non-US-based platform that does not target the US market. | r/technology | comment | r/technology | 2024-05-06 |
Okay so you won't be able to access it in the US then.... | r/technology | comment | r/technology | 2024-05-06 |
Exhibit A:
https://www.reddit.com/r/FauciForPrison/s/sN60c8Kait | r/technology | comment | r/technology | 2024-05-06 |
That's not what this does. This would hold anyone who hosted anything like a forum online liable for anything anyone said.
It would mean the internet goes back to just company websites with no user content at all. The internet just turns into a slightly more usable yellow pages.
No Wikipedia, no YouTube, no user forums with helpful tips on how to do X or how to fix Y etc.
Just company curated pages effectively turning the entire internet into sanitized ads. | r/technology | comment | r/technology | 2024-05-06 |
[Precogs](https://minorityreport.fandom.com/wiki/Precogs) | r/technology | comment | r/technology | 2024-05-06 |
VPNs do exist. Even China and Russia have not managed to block all of them.
Unless the US physically disconnects from the rest of the world or goes full-North Korea, those resources will remain accessible. | r/technology | comment | r/technology | 2024-05-06 |
I get all of this. Ok. | r/technology | comment | r/technology | 2024-05-06 |
>lies, vitriol, and other garbage
These are legal forms of speech for the most part, how would removing section 230 mean someone is held responsible for them? | r/technology | comment | r/technology | 2024-05-06 |
Section 230 means they are only publishers of what reddit intentionally creates and is not liable for what users publish. | r/technology | comment | r/technology | 2024-05-06 |
That’s not how the law works though. If you remove the content once it’s posted you aren’t liable.
Also, tons of sites vet their users so they can reduce their moderation burden. Reddit doesn’t because they profit from user generated content.
The point of Section 230 was that online publishers weren't like newspapers or TV stations: they don't exercise editorial control, so they shouldn't be held liable. But these social media companies factually do use algorithms to control what information people are delivered, and they should, as a result, share the responsibility for the things they host. | r/technology | comment | r/technology | 2024-05-06 |
It means most useful and enjoyable internet stuff will be hosted outside the US. People will still use web forums etc, they just will be non-US-based.
Long term the US would be relinquishing cultural soft power to the rest of the world. | r/technology | comment | r/technology | 2024-05-06 |