Sciencemadness Discussion Board

International Hazard

Posts: 926
Registered: 16-6-2014
Member Is Offline

Mood: No Mood

[*] posted on 22-6-2018 at 17:08
Anyone scrape/save Youtube video comments? I need a good method

I download good YT videos, and often the comment sections are full of very useful information, but I haven't found a good program that will save the comments, and I don't know what format would be best. I've seen some that save to JSON or CSV, and I guess those can be imported into other programs, but they are still hard to view.

Anyone come across this issue and found a good solution?
International Hazard

Posts: 760
Registered: 17-1-2013
Location: Carrboro, NC
Member Is Offline

Mood: anomalous

[*] posted on 22-6-2018 at 19:21

What does ease of reading mean here? Most spreadsheet software will open a CSV without problem; I just did it in LibreOffice Calc. They can also be easily processed on the command line, including using tools like tr to convert commas to whitespace for less cluttered reading.
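If a spreadsheet feels like overkill, a few lines of Python do the same job of turning raw CSV into readable text. This is just a sketch; the two-column author/text layout is an assumed example of what a comment export might look like:

```python
import csv
import io

# Hypothetical comment export with author and text columns.
sample = 'author,text\nAnon,"Nice prep, thanks!"\n'

# Render each row as indented plain text instead of raw CSV.
rows = list(csv.DictReader(io.StringIO(sample)))
readable = "\n".join(f"{r['author']}:\n    {r['text']}" for r in rows)
print(readable)
```

Point the DictReader at an open file instead of the StringIO to use it on a real export.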

al-khemie is not a terrorist organization
"Chemicals, chemicals... I need chemicals!" - George Hayduke
"Wubbalubba dub-dub!" - Rick Sanchez
Super Moderator

Posts: 3845
Registered: 4-10-2014
Location: Oz
Member Is Online

Mood: Metastable, and that's good enough.

[*] posted on 22-6-2018 at 19:29

Chemplayer said he archived his but I don't know how he did it.

A little shameless self-promotion: New stuff on the YT channel. 100 sub celebration. Or you can tour my lab.
International Hazard

Posts: 2847
Registered: 15-10-2015
Location: Western Hemisphere
Member Is Offline

Mood: :cool:

[*] posted on 22-6-2018 at 20:54

YouTube has an API, and it allows retrieving comments.

It would be easier and more elegant to use the API than to scrape. Practically every programming language can parse JSON.
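As a rough sketch of what that looks like in Python: the endpoint and JSON field names below follow the Data API v3 commentThreads reference, but treat them as assumptions and check the current docs; VIDEO_ID and API_KEY are placeholders you supply yourself.

```python
from urllib.parse import urlencode

API_URL = "https://www.googleapis.com/youtube/v3/commentThreads"

def comment_threads_url(video_id, api_key, max_results=100):
    """Build the request URL; fetch it with any HTTP client."""
    params = {"part": "snippet", "videoId": video_id,
              "maxResults": max_results, "key": api_key}
    return API_URL + "?" + urlencode(params)

def extract_comment(item):
    """Pull author and text out of one item of the JSON response."""
    s = item["snippet"]["topLevelComment"]["snippet"]
    return s["authorDisplayName"], s["textDisplay"]
```

The response is paginated, so for long comment sections you would loop, passing each response's nextPageToken back as a pageToken parameter.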

This is my YouTube channel: Extreme Red Cabbage. I don't have much posted, but I try to do nice writeups once in a while.
International Hazard

Posts: 809
Registered: 20-4-2005
Location: Netherlands
Member Is Offline

Mood: Mood

[*] posted on 22-6-2018 at 23:58

I would go for the API. With Postman, for example, you get a nice GUI for calling APIs, and the JSON can be made readable with a code editor such as Brackets (with a beautify plugin).
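If you'd rather skip the editor plugin, that beautifying step is also a couple of lines of Python (the one-line JSON fragment here is made up for illustration):

```python
import json

# A made-up fragment of an API response, all on one line.
raw = '{"items": [{"id": "abc", "snippet": {"textDisplay": "useful tip"}}]}'

# Re-serialize with indentation for human reading.
pretty = json.dumps(json.loads(raw), indent=2)
print(pretty)
```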

Posts: 27
Registered: 25-4-2018
Member Is Offline

Mood: No Mood

[*] posted on 23-6-2018 at 05:31

I sometimes collect dynamic stuff like news and comments, fearing that somebody might die, disappear, get banned, or stop providing content/electricity/internet... for example if somebody becomes sick or angry or tired or sad or arrested or no longer interested. There are many YouTube channels that got banned, and many webpages that disappeared or were seized by the FBI.

I don't know what's wrong with doing it all manually. I saved... let me count... 6683 pages as MHTML files manually, from 432 domains (websites). Plus, CSV is so standard, small, and fast. I use NirSoft CSVFileView as a viewer and make most CSV files manually in Notepad++: simply copy the full first line and replace each field with your content; in most cases you need to wrap fields in "" quotes, for example for YouTube comments or large texts.
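For what it's worth, the "" quoting doesn't have to be done by hand: Python's csv module, for example, inserts the quotes automatically whenever a field contains a comma, quote, or newline.

```python
import csv
import io

buf = io.StringIO()
writer = csv.writer(buf)
# The second field needs quoting; csv.writer handles it,
# doubling the embedded quote marks per the CSV convention.
writer.writerow(["Anon", 'He said "nice", then left'])
line = buf.getvalue()
print(line)
```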

Microsoft Office and the various other office suites are too slow at opening CSV files; they are only worth using for smaller files, not larger ones.

Hell, if you don't have time to do it all manually, then it means you won't have time to read it all later.

Hell, I don't know what an API is; it sounds like something complicated for programmers. I will only become a programmer if I can simplify those complications.
