My little 4-year-old brother was on Scratch today and told me he saw 4 projects that had nude people or * in them. I'm not putting a link to those projects, but the Scratch Team should do something. If someone uploads a project that has the word * or "nude" in it, it should not be allowed on Scratch. I hope everyone on the Scratch Team reads this. Thank you!
Last edited by INDY18 (2008-06-10 19:13:22)
I agree. Scratch is supposed to be kid friendly. If your brother saw what you said he saw, that is definitely not kid friendly. But don't blame the Scratch Team. They have a lot to do, and it is hard to keep up with this kind of stuff. If enough people flag a project as inappropriate, it will automatically be removed.
adriangl wrote:
I agree. Scratch is supposed to be kid friendly. If your brother saw what you said he saw, that is definitely not kid friendly. But don't blame the Scratch Team. They have a lot to do, and it is hard to keep up with this kind of stuff. If enough people flag a project as inappropriate, it will automatically be removed.
I strongly agree!
When you find a project with inappropriate content, just flag it. Thank you. If you have any questions, please contact us.
INDY18 wrote:
My little 4-year-old brother was on Scratch today and told me he saw 4 projects that had nude people or * in them. I'm not putting a link to those projects, but the Scratch Team should do something. If someone uploads a project that has the word * or "nude" in it, it should not be allowed on Scratch. I hope everyone on the Scratch Team reads this. Thank you!
I certainly agree with keeping projects depicting * or nudity off of the Scratch website. In reality, however, something much more sophisticated than a word search on uploaded projects will be required to accomplish this. Similar to the problems encountered when trying to keep spam and viruses out of your email, a system with a great deal of intelligence would be required to keep * and nudity out of the images, projects, project descriptions, project comments, and forum postings. Automatically detecting images depicting * and nudity is practically impossible using existing technology.
Probably the best system around for accomplishing this is the collective and combined intelligence of hundreds (thousands?) of scratchers who have the ability to immediately flag a project that contains offensive material. It is very difficult, if not impossible, to write a computer program that can match the collective intelligence of hundreds of intelligent people working together to achieve a common goal.
By the way, if you want proof that a word search for * and nudity won't do the job, go to Google and search for the following keywords:
* site:http://scratch.mit.edu/forums/
When you do, you will get hits on more than 30 posts on the forums that mention the word *. Although I didn't check them all, I believe that most if not all of them are false alarms insofar as inappropriate content is concerned. One of those posts was complaining about the galleries being sexist, which has an entirely different connotation than might normally be associated with the word * insofar as inappropriate content is concerned. (Granted, however, the scratcher who created that post believes that there are many inappropriate galleries with text such as "Boys Rule and Girls Drool," or something to that effect.) In a week or two, the post that you are reading now will also be included in the hits produced by Google on the keywords given above.
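To make the false-alarm problem concrete, a naive word filter like the one suggested in the original post might look something like this minimal Python sketch. The blocked-word list and the example sentences are made up purely for illustration (the thread's censored word is left out); the point is that a simple substring check trips over innocent words that merely happen to contain a blocked word.

# A minimal sketch of the naive word-filter idea (not anyone's actual code).
# The blocked-word list below is an assumption for illustration only.
BLOCKED_WORDS = ["nude"]

def is_flagged(text: str) -> bool:
    """Return True if any blocked word appears as a substring of the text."""
    lowered = text.lower()
    return any(word in lowered for word in BLOCKED_WORDS)

# An innocent sentence trips the filter because "denuded" happens to contain
# "nude" -- the same kind of false alarm as the "sexist" forum hits above.
print(is_flagged("A storm denuded the island of its trees."))  # True (false alarm)
print(is_flagged("A platform game about a cat."))              # False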
If you remove the word forums from the Google-search keywords, opening the search up to include the entire Scratch website, you will get many more hits. I checked a few of them out and found that with one exception, all of the hits that I checked led to projects that had been removed, presumably because they have been flagged as inappropriate. The one case that I checked that hadn't been dealt with was a comment on a rather innocuous project. It was the comment and not the project that was the culprit in this case. I flagged the comment as inappropriate and it immediately disappeared. On the other hand, there are some in this world who may not have found the comment inappropriate.
Remember, there is very little on this earth as powerful as a large number of like-minded people working together toward a common objective. Therefore, I predict that it will be very difficult to improve on the system of having members flag material as inappropriate.
Last edited by dbal (2008-06-11 03:22:37)
Dbal wrote:
If you remove the word forums from the Google-search keywords, opening the search up to include the entire Scratch website, you will get many more hits. I checked a few of them out and found that with one exception, all of the hits that I checked led to projects that had been removed, presumably because they have been flagged as inappropriate. The one case that I checked that hadn't been dealt with was a comment on a rather innocuous project. It was the comment and not the project that was the culprit in this case. I flagged the comment as inappropriate and it immediately disappeared. On the other hand, there are some in this world who may not have found the comment inappropriate.
Was that comment by any chance on one of my projects? The presidential thing?
Bobby500 wrote:
Dbal wrote:
If you remove the word forums from the Google-search keywords, opening the search up to include the entire Scratch website, you will get many more hits. I checked a few of them out and found that with one exception, all of the hits that I checked led to projects that had been removed, presumably because they have been flagged as inappropriate. The one case that I checked that hadn't been dealt with was a comment on a rather innocuous project. It was the comment and not the project that was the culprit in this case. I flagged the comment as inappropriate and it immediately disappeared. On the other hand, there are some in this world who may not have found the comment inappropriate.
Was that comment by any chance on one of my projects? The presidential thing?
I don't remember what project it was on, but it was an explicit sexual comment having nothing to do with the president.
dbal wrote:
Bobby500 wrote:
Dbal wrote:
If you remove the word forums from the Google-search keywords, opening the search up to include the entire Scratch website, you will get many more hits. I checked a few of them out and found that with one exception, all of the hits that I checked led to projects that had been removed, presumably because they have been flagged as inappropriate. The one case that I checked that hadn't been dealt with was a comment on a rather innocuous project. It was the comment and not the project that was the culprit in this case. I flagged the comment as inappropriate and it immediately disappeared. On the other hand, there are some in this world who may not have found the comment inappropriate.
Was that comment by any chance on one of my projects? The presidential thing?
I don't remember what project it was on, but it was an explicit sexual comment having nothing to do with the president.
Probably was mine then. I made the project and... well, it is a long story. I will just delete the comments pertaining to it.
Sorry
Bobby500 wrote:
dbal wrote:
Bobby500 wrote:
Was that comment by any chance on one of my projects? The presidential thing?
I don't remember what project it was on, but it was an explicit sexual comment having nothing to do with the president.
Probably was mine then. I made the project and... well, it is a long story. I will just delete the comments pertaining to it.
Sorry
I don't know what I said that would cause you to conclude that it was your project. Please don't delete the comments on your project on my account. As I recall, it was an explicit sexual comment made by someone who was being harshly critical of the project. The project was innocent of all blame, so if it was your project, you are innocent. The blame falls squarely on the very rude scratcher who posted the comment on the project. And I saw only one inappropriate comment among the several comments that had been posted.
Last edited by dbal (2008-06-11 11:47:40)
Fortunately, I have not run into sexual content on Scratch yet. I'm sorry he was exposed to it.
Ok, I'm trying to understand how your suggestion would work, dbal.
1. Someone flags what they deem an inappropriate project and writes a comment like "it has a bad picture."
2. The website extracts all the costumes (and text, audio?) in the project and displays thumbnails of them along with the project in a special project view that only adult volunteers/moderators can see. (So hidden pornography can be detected.)
3. Adult volunteer users and Scratch Team members are subscribed to this "flagged" project list. On their next page load, a thumbnail of the project is shown to them at the top of the screen...
4. When clicked, they are taken to the special project view page.
5. The adult volunteer, after reviewing the project and costumes, can hide or delete the project immediately. (Perhaps tag the project with the acting moderator's username so actions and success can be tracked?)
I think something like that may work best; a rough sketch of those steps follows. (It might not be far off from how it works already.)
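Written out as Python, steps 1-5 might be sketched roughly like this. Every name and field here is hypothetical; this is only the workflow described above spelled out as code, not how scratchr actually handles flags.

# A hypothetical sketch of steps 1-5 above; none of these names come from the
# real scratchr code, and the real system may work quite differently.
from dataclasses import dataclass, field
from typing import List

@dataclass
class FlaggedProject:
    project_id: int
    reporter: str
    reason: str                                   # e.g. "it has a bad picture"
    costumes: List[bytes] = field(default_factory=list)  # extracted images for the special view
    resolved_by: str = ""                         # acting moderator, for tracking

# Step 3: the list that adult volunteers/moderators are subscribed to.
flag_queue: List[FlaggedProject] = []

def flag_project(project_id: int, reporter: str, reason: str,
                 costumes: List[bytes]) -> None:
    """Steps 1-2: record the flag and the extracted costumes."""
    flag_queue.append(FlaggedProject(project_id, reporter, reason, costumes))

def hide_project(project_id: int) -> None:
    """Stand-in for whatever the site actually does to hide a project."""
    print(f"project {project_id} hidden")

def review(moderator: str, item: FlaggedProject, hide: bool) -> None:
    """Steps 4-5: the moderator reviews the special view, then hides or clears it."""
    item.resolved_by = moderator                  # tag with the acting moderator
    if hide:
        hide_project(item.project_id)
    flag_queue.remove(item)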
Other thoughts off the top of my head:
There is an alpha GPL nudity detector. (If you can't find it, contact me.) Perhaps scratchr could extract the costumes and run them through something like this. It might produce many false positives and headaches, though.
Perhaps run an MD5 on all image data in projects, catalog them, and list every project each image is in. A safeness rating could be assigned to each image based on the exposure of the projects it's in... each new costume could be displayed to a moderator in a gallery view showing hundreds of these; if it was not marked, it is deemed safe.
Also, if an image deemed unsafe shows up in a project again, the project is immediately flagged/hidden/deleted. (That might be a huge mess too, though...) A rough sketch of that hash-catalog idea is below.
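A minimal sketch of the hash-catalog idea might look like the following. The catalog layout and the flag-on-reappearance rule are assumptions based only on the description above, not on anything scratchr actually does.

# A minimal, hypothetical sketch of the MD5 cataloguing idea described above.
import hashlib
from collections import defaultdict

image_catalog = defaultdict(set)   # md5 hex digest -> set of project ids using that image
unsafe_hashes = set()              # digests a moderator has marked unsafe

def register_costume(project_id: int, image_data: bytes) -> bool:
    """Catalog a costume's hash; return False if it has already been marked
    unsafe, so the project can be flagged/hidden immediately."""
    digest = hashlib.md5(image_data).hexdigest()
    image_catalog[digest].add(project_id)
    return digest not in unsafe_hashes

def mark_unsafe(image_data: bytes) -> set:
    """A moderator marks an image unsafe; return every project that used it."""
    digest = hashlib.md5(image_data).hexdigest()
    unsafe_hashes.add(digest)
    return set(image_catalog[digest])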
Last edited by AddZero (2008-06-11 12:39:14)
AddZero wrote:
Fortunately, I have not run into sexual content on Scratch yet. I'm sorry he was exposed to it.
Ok, I'm trying to understand how your suggestion would work, dbal.
1. Someone flags what they deem an inappropriate project and writes a comment like "it has a bad picture."
2. The website extracts all the costumes (and text, audio?) in the project and displays thumbnails of them along with the project in a special project view that only adult volunteers/moderators can see. (So hidden pornography can be detected.)
3. Adult volunteer users and Scratch Team members are subscribed to this "flagged" project list. On their next page load, a thumbnail of the project is shown to them at the top of the screen...
4. When clicked, they are taken to the special project view page.
5. The adult volunteer, after reviewing the project and costumes, can hide or delete the project immediately. (Perhaps tag the project with the acting moderator's username so actions and success can be tracked?)
I think something like that may work best. (It might not be far off from how it works already.)
Other thoughts off the top of my head:
There is a GPL beta nudity detector. (If you can't find it, contact me.) Perhaps scratchr could extract the costumes and run them through something like this. It might produce many false positives and headaches, though.
Perhaps run an MD5 on all image data in projects, catalog them, and list every project each image is in. A safeness rating could be assigned to each image based on the exposure of the projects it's in... each new costume could be displayed to a moderator in a gallery view showing hundreds of these; if it was not marked, it is deemed safe.
Also, if an image deemed unsafe shows up in a project again, the project is immediately flagged/hidden/deleted. (That might be a huge mess too, though...)
Since I don't know what goes on behind the scenes for flagged projects, I won't try to suggest how the staff should handle them.
However, unless inappropriate material on the site becomes a serious problem, I would avoid the temptation to over-automate the process by applying such things as automatic nudity detectors and MD5 hashcode libraries for images. I would simply continue to rely on the judgment of the scratcher population and existing staff procedures.
If something like automatic nudity detectors and MD5 hashcode libraries for images does become necessary, maintenance of the overall Scratch project may become overwhelming for the project staff at MIT. I sincerely hope that doesn't happen because such a situation could lead to the collapse of the entire project, depending of course on the level of financial support that is available. Only the members of the staff can decide how far they need to go in this regard.
Last edited by dbal (2008-06-11 12:47:54)