Mozilla's extension was MAF, and it was a total mess.. it doesn't work now, and I'm not sure it ever did. I tested it a while back on FF 1.5 and 2.0, and the only thing that worked was opening/viewing MHT. I think it was supposed to use zip compression, because like Gunter said it shipped zip DLLs and XPTs in its components folder:
IZipWriterComponent.xpt
ZipWriterComponent.dll
I'm thinking of something similar (like the Mafuka audio approach): zip the files, and for viewing, extract them to temp without the user noticing and send the index page to K-Meleon.
So far, I was able to view the file by using a hexed 7z DLL (with a .kma extension) to extract the archive into a specified folder under temp (temp\unkma). The AutoIt script finds the main .htm file and sends it to K-Meleon without problems. Images and the site's other files are preserved in the folder structure they were extracted with, so K-Meleon was able to open the page with all its required files.
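The viewing step above (extract the archive to temp, find the main .htm, hand it to the browser) can be sketched like this. The original uses a hexed 7z DLL driven by AutoIt; this is only a hedged stand-in using Python's `zipfile` (the archive is assumed zip-compatible), and `open_kma` is a hypothetical name:

```python
import tempfile
import zipfile
from pathlib import Path

def open_kma(archive_path: str) -> str:
    """Extract a .kma archive (assumed zip-compatible) into a fresh
    temp folder and return the path of the main .htm/.html file,
    which the caller would then pass to the browser."""
    dest = Path(tempfile.mkdtemp(prefix="unkma_"))
    with zipfile.ZipFile(archive_path) as z:
        z.extractall(dest)
    # Pick the shallowest .htm/.html file as the index page.
    pages = sorted(dest.rglob("*.htm*"), key=lambda p: len(p.parts))
    if not pages:
        raise FileNotFoundError("no .htm file found in archive")
    return str(pages[0])
```

An AutoIt script would do the same job with RunWait on 7z and a file search, then launch K-Meleon with the returned path.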
Now I'm facing problems with save. I first used wget (originally for Linux, but there's a Windows binary now); I could pass all the parameters to wget, and it saves the page to a specified folder. AutoIt waits until it's done, then passes the folder to the hexed 7z, which compresses it with the .kma extension. No problem there, but wget has another big problem: it's great when saving a site's top page (www.google.com), but when you go to sublevels of a site (google.com/some/deep/forum/main.php), and especially when the main page is not plain HTML (ASP or PHP, as in forums), wget messes up the folder structure and fails to create an index page.
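The working half of that pipeline, compressing the saved folder into a .kma once the save finishes, could look like the sketch below. Again this substitutes Python's `zipfile` for the hexed 7z DLL, and `pack_kma` is a hypothetical name; it only assumes the archive format is zip-compatible:

```python
import zipfile
from pathlib import Path

def pack_kma(saved_dir: str, archive_path: str) -> None:
    """Compress the folder the page was saved into (by wget or the
    browser) into a zip-format .kma, preserving relative paths so
    the site's folder structure survives a later extraction."""
    root = Path(saved_dir)
    with zipfile.ZipFile(archive_path, "w",
                         compression=zipfile.ZIP_DEFLATED) as z:
        for f in sorted(root.rglob("*")):
            if f.is_file():
                # Store each file under its path relative to the
                # saved folder, not its absolute temp path.
                z.write(f, f.relative_to(root))
```

This is the easy half; the hard part, as described above, is getting a correctly structured folder with an index page out of the save step in the first place.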
So what I need is to have K-Meleon, not wget, save the page (web page, complete), and it should be triggered from AutoIt rather than from a macro, so that AutoIt controls the file name and folder path to compress later. If the save is done from a macro, AutoIt has no way of knowing where the page was saved or under what name. So I need to know: is it possible to make K-Meleon save a complete page from the command line?
Yes, Panzer.. I'm biting my tongue.