I-Mockery Forum > Philosophy, Politics, and News
Thread: Google is Big Brother?
Topic Review (Newest First)
Jul 15th, 2003 08:25 AM
Pub Lover
Quote:
Originally Posted by Sethomas
my website also gets a disproportionate number of visitors from Portugal and New Zealand.
Yeah, it's funny how your site made it into my daily routine.

No, not funny, sad, very, very sad.
Jul 13th, 2003 09:28 PM
kahljorn They didn't count multiple postings of the same site, which happens often.
Jul 13th, 2003 09:26 PM
Zero Signal This is really interesting, too.

http://www.google-watch.org/broken.html



Quote:
Let's speculate. Most of Google's core software was written in 1998-2000. It was written in C and C++ to run under Linux. As of July 2000, Google was claiming one billion web pages indexed. By November 2002, they were claiming 3 billion. At this rate of increase, they would now be at 3.5 billion, even though the count hasn't changed on their home page since November. If you search for the word "the" you get a count of 3.76 billion. It's unclear what role other languages would have, if any, in producing this count. Perhaps each language has its own lexicon and its own web page IDs. But any way you cut it, we're approaching 4 billion very soon, at least for English. With some numbers presumably set aside for the freshbot, it would appear that they are running out of available web page IDs.

If you use an ID number to identify each new page on the web, there is a problem once you get to 4.2 billion. Numbers higher than that require more processing power and different coding. Our speculation makes three major assumptions: a) Google uses standard functions for the C language in their core programming; b) when Google's programs were first developed four or more years ago, a unique ID was required for every web page; and c) it seemed reasonable and efficient at that time to use an unsigned long integer in ANSI C. In Linux, this variable is four bytes long, and has a maximum of 4.2 billion before it rolls over to zero. The next step up in numeric variables under Linux requires different standard functions in ANSI C, and more CPU cycles for processing. When the core programs were developed for Google several years ago, it's reasonable to assume that the 4.2 billion upper limit was not seen as a potential problem.
Jul 13th, 2003 09:19 PM
Sethomas Judging from traffic to my site, it'd seem that closer to 95% of all search referrals come from Google, not 75% as stated by the report. But then again, my website also gets a disproportionate number of visitors from Portugal and New Zealand. Personally, none of the points made against them seem to really infringe upon modern corporate ethics, though it doesn't seem fiscally beneficial, as things stand, for them to store that much data indefinitely. That raises some curiosity, but they still seem to be a long way from doing any harm with it.
Jul 13th, 2003 09:11 PM
Zero Signal Fantastic.

"Matt Cutts, a key Google engineer, used to work for the National Security Agency. Google wants to hire more people with security clearances, so that they can peddle their corporate assets to the spooks in Washington."

Jul 13th, 2003 08:54 PM
Jeanette X
Google is Big Brother?

http://www.google-watch.org/bigbro.html


© 2008 I-Mockery.com