
Website Capture Applications

14 Posts
13 Users
1 Likes
3,946 Views
Jesterladd
(@jesterladd)
Posts: 28
Trusted Member
Topic starter
 

Dear Gurus

I was in a discussion today with a colleague about website capture applications. Six years have passed since we last captured a website for evidence. We would be interested to know which website capture apps people use and which one was felt to be the best.

Thank you for your assistance.

Jesterladd

 
Posted : 07/07/2009 8:37 pm
(@mindsmith)
Posts: 174
Estimable Member
 

Good question. I would also be interested to know if there are any 'verifiable' and time-efficient tools for such purposes.

As an aside, in the past the time needed to download an entire site (remotely - 9.5GB on one website recently) has sometimes been an issue, so to document the contents/view of a website I have used Camtasia, with a lot of care taken to show and prove that it is the actual website in question and to open and browse all files and directories. This has helped in persuading non-technical people about what was found, how it was accessed, and that the contents are 'as is'.

 
Posted : 08/07/2009 12:14 am
(@rampage)
Posts: 354
Reputable Member
 

For capturing websites you can use wget recursively and log the sessions with Wireshark. That way you have both the website content and the server messages, which supports timestamping and non-repudiation of the source of the content.
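A rough sketch of how that could be scripted (the URL, capture interface, and output paths below are placeholders; it assumes wget and tshark are installed and that tshark has permission to capture on the chosen interface):

```python
#!/usr/bin/env python3
"""Sketch: mirror a site with wget while recording the traffic with tshark."""
import subprocess

TARGET = "http://www.example.com/"   # placeholder URL
INTERFACE = "eth0"                   # placeholder capture interface
PCAP_OUT = "capture.pcap"
MIRROR_DIR = "site_mirror"

# Start the packet capture in the background (tshark may need elevated privileges).
capture = subprocess.Popen(["tshark", "-i", INTERFACE, "-w", PCAP_OUT])

try:
    # Recursive mirror: follow links, fetch page requisites, stay below the start URL.
    subprocess.run(
        ["wget", "--recursive", "--page-requisites", "--no-parent",
         "--directory-prefix", MIRROR_DIR, TARGET],
        check=True,
    )
finally:
    # Stop the capture once the mirror is complete.
    capture.terminate()
    capture.wait()
```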

 
Posted : 08/07/2009 12:15 am
(@gkelley)
Posts: 128
Estimable Member
 

I haven't used it, but WebCase was designed with computer forensics in mind. http://www.veresoftware.net/

 
Posted : 09/07/2009 6:49 am
(@kovar)
Posts: 805
Prominent Member
 

Greetings,

- wget and HTTrack for mirroring entire sites.
- Adobe Acrobat Pro for turning entire sites into PDFs.
- Screen/video capture for recording sites that use a lot of Java/Flash.

The first two approaches do not handle Java/Flash/user interaction well, hence the third option.

-David

 
Posted : 09/07/2009 8:00 am
(@the_wodon)
Posts: 1
New Member
 

As well as using HTTrack for live websites, I have found Warrick useful for capturing websites that have been removed, by mirroring the archive.org, Google, and Yahoo caches.

Not perfect, but useful in a pinch, as archive.org uses some odd JavaScript.

http://warrick.cs.odu.edu/
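If Warrick is not an option, the Wayback Machine can also be queried directly for the closest archived copy of a page; a minimal sketch, assuming the public availability endpoint and its JSON layout (the target URL and timestamp are placeholders):

```python
#!/usr/bin/env python3
"""Sketch: look up the closest archived copy of a URL on archive.org."""
import json
import urllib.parse
import urllib.request

target = "http://www.example.com/"   # placeholder URL
timestamp = "20090701"               # placeholder YYYYMMDD to search near

query = urllib.parse.urlencode({"url": target, "timestamp": timestamp})
with urllib.request.urlopen(f"https://archive.org/wayback/available?{query}") as resp:
    data = json.load(resp)

closest = data.get("archived_snapshots", {}).get("closest")
if closest:
    print("Archived copy:", closest["url"], "captured", closest["timestamp"])
else:
    print("No archived snapshot found for", target)
```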

Paul

 
Posted : 09/07/2009 5:31 pm
(@gkelley)
Posts: 128
Estimable Member
 

Website Ripper/Copier is another helpful tool that I have used.

 
Posted : 09/07/2009 6:06 pm
(@domus)
Posts: 1
New Member
 

There are quite a few tools that can do this, but one good online option is GrabzIt's Web Scraper. It can convert every web page in a website to PDF, and they have a useful tutorial on how to get started converting an entire website to PDF.

They also provide the ability to add a timestamp watermark to the PDF, which may be useful for some use cases.
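The same page-to-PDF idea can also be scripted locally; the sketch below uses pdfkit/wkhtmltopdf rather than GrabzIt's own API, so it is only an illustration of the approach (the URLs are placeholders, and it assumes the pdfkit package and the wkhtmltopdf binary are installed):

```python
#!/usr/bin/env python3
"""Sketch: save a list of pages as PDFs (generic illustration, not GrabzIt's API)."""
import pdfkit

pages = [
    "http://www.example.com/",        # placeholder URLs
    "http://www.example.com/about",
]

for i, url in enumerate(pages, start=1):
    # One PDF per captured page: page_001.pdf, page_002.pdf, ...
    pdfkit.from_url(url, f"page_{i:03d}.pdf")
    print("Saved", url)
```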

 
Posted : 15/04/2019 7:44 am
marky.mark
(@marky-mark)
Posts: 22
Eminent Member
 

Hello,

If you need a tool that preserves integrity with hash values of the captured websites, you can use Hunchly. This tool saves every web page you visit as you browse in Google Chrome. You can export the visited pages in MHTML or PDF format. As I said, the tool keeps hash values for both the pages consulted and the exported pages. After the investigation, Hunchly will build a report of the investigative work you have done. It also lets you see the metadata of images found on the web/social networks.

This is a professional tool used by many law enforcement teams, and the creator, Justin Seitz, is easy to reach and talk to.

You should give it a try; there is a trial on their website.

https://www.hunch.ly/
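Whatever tool is used, the same hashing idea can be applied to any set of exported capture files; a minimal sketch of a generic SHA-256 manifest (the "exports" directory is a placeholder, and this is not Hunchly's own report format):

```python
#!/usr/bin/env python3
"""Sketch: write a SHA-256 manifest for exported capture files."""
import hashlib
from pathlib import Path

export_dir = Path("exports")            # placeholder directory of MHTML/PDF exports
manifest = Path("manifest_sha256.txt")

with manifest.open("w") as out:
    for path in sorted(export_dir.rglob("*")):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            out.write(f"{digest}  {path}\n")

print("Manifest written to", manifest)
```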

 
Posted : 15/04/2019 1:29 pm
 dega
(@dega)
Posts: 261
Reputable Member
 

There's also the Italian FAW: https://www.fawproject.com/

 
Posted : 15/04/2019 2:42 pm