Authored by: Anonymous on Saturday, May 25 2013 @ 08:07 PM EDT
Larry Page should be charged right along with the Scripps reporters. They couldn't
have done whatever they did without Google's indexing the docs.
Never mind, I s'pose some lawyers will get a meal or two out of
this exercise in stupidity, before the light dawns ...
Authored by: Anonymous on Saturday, May 25 2013 @ 09:16 PM EDT
A couple of days ago I read an email asking about a new, hitherto
unknown hacking tool that appears to be called "wget". Somebody
was using it to hack into his client's website and retrieve
thousands of confidential company records.
I merely shook my head in wonder at the ignorance of fundamental,
long-established tools that the writer --- who claims to be a
professional forensic data recovery specialist --- displayed.
Authored by: bugstomper on Saturday, May 25 2013 @ 11:01 PM EDT
PJ said in her note in the News Pick article:
"I checked terracome.com and there is no robots.txt file"
PJ, was that an unauthorized search of the terracome.com web site? You entered a
specially crafted URL designed to extract information from the web site that it
obviously was not intended to provide (as evidenced by nothing being there). A
clear violation of the CFAA! :)
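(For anyone who wants to replicate the "unauthorized search": a sketch of
the check, using terracome.com only because it is the site mentioned above:
  wget -q -O - http://terracome.com/robots.txt
If the server answers 404, there is no robots.txt to honor, and the command
prints nothing at all.)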
Authored by: Anonymous on Sunday, May 26 2013 @ 01:54 AM EDT
'Reporters use Google, find breach, get branded as "hackers"'
(Ars Technica article, 21 May 2013)
Like many other savvy web surfers, I use wget on a regular basis.
It's very useful as a backup to Firefox's download capability. When a
download aborts more than once in FF I switch to wget, which is more
tolerant of network hiccups. Plus it uses a LOT less RAM.
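A minimal sketch of that fallback, with a made-up URL standing in for
whatever Firefox choked on:
  wget -c http://example.com/big-file.iso
The -c (continue) flag makes wget pick up a partial download where it
left off instead of starting over, and by default it retries on its own
after a network hiccup.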
TerraCom and YourTel seem to be clueless
about how opening one's kimono online may subject one's navel to a form of lint
checking.
"Wget
Manual - Examples" (Lars Appel's Wget Manual page)
"GNU Wget 1.13.4
Manual" (GNU.org manual page)
Authored by: Anonymous on Sunday, May 26 2013 @ 04:19 AM EDT
"Because of that, Wget honors RES[Robots Exclusion Standard] when
downloading recursively." -- gnu wget manual
If they weren't using the recursive download features, but rather using a
separate search script that called wget for the downloads, it would not honor
any robots.txt file exclusions (had it been present). This could be considered
hacking, depending on the complexity of the script (encrypting page names,
falsifying session identifiers, etc). It would still, however, require the site
security to be the equivalent of a spiderweb.[ Reply to This | Parent | # ]
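To make that distinction concrete, a sketch with a hypothetical host and
URL list (not anyone's actual setup):
  # Recursive crawl: wget fetches /robots.txt first and skips whatever it disallows
  wget -r http://example.com/reports/
  # Scripted one-file-at-a-time fetches: robots.txt is never consulted
  while read url; do wget "$url"; done < list-of-urls.txt
Same tool, same server; only the recursive mode bothers to check the
Robots Exclusion Standard.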
Authored by: tiger99 on Sunday, May 26 2013 @ 04:29 AM EDT
It just does what anyone sitting at their PC with a web browser can do, except
that it is automated and scriptable, and omits the visual preview of what you
want to save. In other words, some automation is replacing actions which any
person can perform. Now how would that be illegal?
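To put it in concrete terms (URL purely illustrative), the dreaded hacking
tool is invoked like this:
  wget http://example.com/somepage.html
which is the command-line equivalent of File > Save Page As, minus the
mouse clicks.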
Authored by: JamesK on Sunday, May 26 2013 @ 11:02 AM EDT
Perhaps they should also ban things like hammers and crowbars because they can
be used to break into homes & businesses. Never mind that they have long
been useful tools with many legitimate purposes.
---
The following program contains immature subject matter.
Viewer discretion is advised.
Authored by: Anonymous on Sunday, May 26 2013 @ 03:57 PM EDT
http://groklaw.net/robots.txt
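For anyone who has never looked at one, a robots.txt is just a plain text
file of directives along these lines (a generic illustration, not a copy of
any particular site's file):
  User-agent: *
  Disallow: /private/
That is the sort of file PJ went looking for, and didn't find, on
terracome.com.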
Authored by: Anonymous on Sunday, May 26 2013 @ 05:50 PM EDT
Here's the side of the story from one of the plaintiffs. Note that there
is a link to this from their home page.
http://www.yourtelamerica.com/security-notice/
Not clicky so you can practice l33t h4x0r sk1lz
It is left as an exercise for the gentle read^H^H^H^Hhacker
to examine their robots.txt and weep or smile ...