AI and the Media

 
Guillermo Ishi
Ranch Hand
Posts: 789
Python C++ Linux
Nick Bostrom's talk got me thinking about how much the machines might already be in control without us being aware of it.

Many news articles today are written by computer, and the reader is unaware of it. News organizations measure success by traffic, so when the system sees traffic on a story, the algorithm generates more of the same. That coverage becomes more widespread and reaches more people, generating even more traffic... When I see quick, massive changes in public opinion on something, like I've seen in the last few days, I wonder if something like this isn't behind it. So essentially we're doing the machine's bidding. It may not be trying to benevolently or malevolently steer our lives, but it is feeding itself, with changes in public opinion being a waste product, not its goal.
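In code terms, the loop I'm imagining is something like this rough sketch (the topics, numbers and helper names are all made up for illustration, not any real news system):

import random

# Hypothetical sketch of the traffic feedback loop described above.
# Topics, numbers and helper names are invented for illustration.
traffic = {"election": 10, "celebrity": 10, "weather": 10}   # clicks so far
published = []

def generate_story(topic):
    # stand-in for an automated article writer
    return f"Auto-generated story about {topic}"

for day in range(30):
    hottest = max(traffic, key=traffic.get)      # what got the most clicks?
    published.append(generate_story(hottest))    # so write more of it...
    traffic[hottest] += random.randint(5, 15)    # ...which attracts more clicks

print(traffic)        # one topic snowballs while the others stand still
print(len(published), "stories, every one of them chasing the same topic")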
 
Jesper de Jong
Java Cowboy
Posts: 16084
88
Android Scala IntelliJ IDE Spring Java

Guillermo Ishi wrote:Many news articles today are written by computer, and the reader is unaware of it.


What do you mean, that the computer itself writes a news article, and people just publish whatever the computer says without even checking it? I've never heard of something like that.

But one other instance where computers do all of the work, and where it can go wrong, is in high-frequency trading. People build computers and program algorithms to buy and sell stocks very quickly. This can lead to a stock market crash, as happened in the "Flash Crash" of May 2010.
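Roughly, the danger is a runaway feedback loop. This toy simulation (all numbers and behaviour are invented purely for illustration, not how any real trading system works) shows how momentum-chasing algorithms can turn a small dip into a collapse:

# Each "bot" sells into a falling market, which pushes the price down further,
# which triggers more selling. Purely illustrative numbers.
price = 100.0
last_price = 100.5          # assume a small initial dip

for tick in range(50):
    change = price - last_price
    last_price = price
    if change < 0:
        price += change * 1.2   # algorithms amplify the fall by selling
    else:
        price += 0.1            # quiet upward drift otherwise
    if price <= 0:
        print(f"market 'crashed' at tick {tick}")
        break

print(f"final price: {max(price, 0.0):.2f}")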
 
chris webster
Bartender
Posts: 2407
36
Scala Python Oracle Postgres Database Linux
I tend to assume that:

1) People are idiots (me included).
2) Machines are idiots (even the "intelligent" ones).

I find these basic principles allow me to discount much of what goes on as the inevitable consequence of either (1) or (2) above.
 
Guillermo Ishi
Ranch Hand
Posts: 789
Python C++ Linux

Jesper de Jong wrote:
What do you mean, that the computer itself writes a news article, and people just publish whatever the computer says without even checking it? I've never heard of something like that.



Here's an article from this month: "The AP now publishes 3,000 robot-written articles per quarter".
http://www.threadwatch.org/node/32334
 
chris webster
Bartender
Posts: 2407
36
Scala Python Oracle Postgres Database Linux
I don't doubt that a fair amount of boilerplate "churn-alism" is computer generated, but so what? There still has to be some original material to feed into the generators, although there are certainly going to be feedback loops as well. All the hoo-hah about Google Translate (another AI success story) tends to ignore the key fact that it only works because humans generate the original content that it can use for its statistical analysis. No "intelligence" there, just a lot of number-crunching that is ultimately about encouraging you to look at Google sites, feed their machines with extra data, and consume the adverts they're paid to sell. This is true of a lot of the AI hype we see in the tech press, and our industry must surely be the most over-hyped in history.

More importantly, Sturgeon's Law applies, i.e. 90% of everything is crap. Personally, I don't much care whether online content is generated by humans or machines, because it's mostly crap anyway. People have been generating crap for centuries, and most "journalism" has long been a way of filling the gaps between ads, driven by what sells and what the proprietor wants. People already swallow this stuff without much thought as to whose interests are served by the lead stories. Which brings us to the question of who's in charge.

Machines may be generating "content", but they're not in control, because they are being operated by and for people and organisations who want your attention and your dollars. Manipulation of public opinion, the markets and pretty much anything else has always been exploited by the powerful to ensure that they retain and accrue as much wealth (= power to grab more wealth) as they can. Follow the money and you'll find the most important feedback loop in the process: it's all about the money, and it always has been.

So you're right, there are lots of new tools to automate a process of moving wealth from poor to rich that has always existed, but they are still ultimately serving the goals of their owners, because that's what they're for in the first place. So, for example, flash crashes caused by computerised algorithmic trading are a problem, but there are already moves afoot to limit these systems, because the people who benefit from them want to know that their wealth is not going to be wiped out in a flash. We've had high-tech nuclear weapons for 70 years, but the reason we haven't had nuclear Armageddon is that powerful people realise it's not in their interests, so they limit access to nukes and (generally) manage them more carefully. So I'm sure we'll see plenty more examples of feedback problems, weird market movements, swings in public opinion, malfunctioning autonomous systems (battle robots, anybody?) and so on. But they will still mainly be about serving the interests of the powerful, and the rich will act to secure their interests if these things threaten the status quo. In the end, I think real-world power will trump processor power every time.

And this is a First World Problem anyway: all this talk of the Singularity, while billions of people still don't have access to clean water or reliable energy. If Ray Kurzweil and his tech-boosting ilk want to solve a Hard Problem, they could start there.
 
Guillermo Ishi
Ranch Hand
Posts: 789
Python C++ Linux

chris webster wrote:I don't doubt that a fair amount of boilerplate "churn-alism" is computer generated, but so what? There still has to be some original material to feed into the generators, although there are certainly going to be feedback loops as well. All the hoo-hah about Google Translate (another AI success story) tends to ignore the key fact that it only works because humans generate the original content that it can use for its statistical analysis. No "intelligence" there, just a lot of number-crunching that is ultimately about encouraging you to look at Google sites, feed their machines with extra data, and consume the adverts they're paid to sell. This is true of a lot of the AI hype we see in the tech press, and our industry must surely be the most over-hyped in history.



Lots of good points there. But the translate example is a bad one, really; obviously it depends on human input because it's working with (I should say analyzing) human language. Now the feedback loops, yes. That is where it's at.

Without trying to define intelligence... browse kaggle.com, which is a data science site, and look at some of the projects people are working on in predictive analytics. Intelligence or not, the algorithms and processing power exist to cultivate clicks via news stories on a scale that manipulates public opinion randomly; whatever gets clicks. If it isn't being done already, it would not be hard to get a rudimentary version going. However, I am probably the only person who could turn it into a money-losing proposition... But it's too nefarious and I wouldn't want to do it. At least not let it loose in the wild.
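To be concrete about the kind of thing I mean, here's the sort of toy click-predictor anyone could knock together (the headlines and click labels are invented, and scikit-learn is just one convenient example library; a real system would obviously be far bigger):

# Toy "which headline gets clicks?" model, the sort of predictive-analytics
# exercise you see on kaggle.com. All data here is invented.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

headlines = [
    "Shock poll shows massive swing in public opinion",
    "You won't believe what this celebrity did next",
    "Quarterly earnings report released on schedule",
    "Committee publishes routine infrastructure update",
]
clicked = [1, 1, 0, 0]   # invented labels: did readers click?

model = make_pipeline(CountVectorizer(), LogisticRegression())
model.fit(headlines, clicked)

# Rank candidate headlines by predicted click probability, then
# "generate more of the same" for whichever ones score highest.
candidates = [
    "Massive swing in opinion shocks experts",
    "Routine committee report published",
]
for h in candidates:
    prob = model.predict_proba([h])[0][1]
    print(f"{prob:.2f}  {h}")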

Regarding the power elite, I would not trust them to keep the computers away from us. They would probably be the ones to turn the computers loose on us in the first place. I don't know if they would ever become subject to them themselves unless, say, we passed laws to that effect. Or perhaps in the future it could become the desirable thing, no laws needed.

But really, the topic was about what is possible in the present, and about something that could be happening now (and almost certainly is, to some extent) without us being aware of it.
 