Search Results

Search found 58272 results on 2331 pages for 'apache log files'.


  • Microsoft Outlook: Export list of currently opened PST files

    - by ultrasawblade
    At my current workplace we are upgrading various users from XP to Windows 7. Frequently the users have anywhere from 10 to 30 or so .pst files opened within their installation of Microsoft Outlook 2007, and these users are particularly helpless without those files. I know how to view the list of currently opened PST files, and would like to know if there is an easy way to capture that information other than taking screenshots of the Options - Data Files window. Does migwiz.exe transfer this information? Is that the only way? Is there a tool that will let you capture and restore that information? I don't want to export or move the actual .pst files themselves (yes, some of them are on network locations, which I know is terrible); I just want to reopen, in a new installation of Outlook, the files that were open in the previous installation.
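
    One hedged way to capture the list without screenshots is to read it from Outlook's COM object model (the Stores collection, available from Outlook 2007 onward); the output file name below is just an example. A PowerShell sketch:

        # dump the full path of every data file opened in the running Outlook profile
        $outlook = New-Object -ComObject Outlook.Application
        $outlook.Session.Stores | ForEach-Object { $_.FilePath } |
            Out-File "$env:USERPROFILE\Desktop\pst-list.txt"

    The resulting text file can then be carried to the new machine and each path re-added by hand, or scripted against the Namespace.AddStore method.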

    Read the article

  • Troubles with apache and virtual hosts

    - by xZero
    I have a big problem. I have a VPS running Debian with a fresh LAMP install, and I use Webmin as the control panel. I am trying to set up multiple subdomains on my server using Webmin, for example: downloads.my-domain.com, cpanel.my-domain.com, forum.my-domain.com. The problem is this: with no virtual hosts defined, everything works perfectly when I access the server as my-domain.com, but as soon as I add a virtual host I can access it, while my-domain.com becomes unavailable because it redirects to the virtual host I added. With two or more virtual hosts the problem remains, and when I try to access one of them, for example downloads.my-domain.com, it again redirects to cpanel.my-domain.com. When I delete the virtual hosts, access to my-domain.com works again. What I know: it is not a problem with my domain provider; I correctly added the subdomains and pointed the host records at my VPS IP. Every virtual host has a unique name, there are no duplicate virtual hosts, and every virtual host has its own directory (for example, downloads.my-domain.com has its own WWW dir, /var/downloads). Can somebody help me? Thanks.
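
    This is the classic symptom of name-based virtual hosting: once any VirtualHost exists, the main server configuration stops answering, and every request whose Host header matches no ServerName falls through to the first virtual host in the configuration. A minimal sketch for Apache 2.2, assuming the bare domain gets its own (first, hence default) vhost; the DocumentRoot paths other than /var/downloads are made up for illustration:

        NameVirtualHost *:80

        # the first vhost is the default for any unmatched Host header, so put the bare domain here
        <VirtualHost *:80>
            ServerName my-domain.com
            ServerAlias www.my-domain.com
            DocumentRoot /var/www
        </VirtualHost>

        <VirtualHost *:80>
            ServerName downloads.my-domain.com
            DocumentRoot /var/downloads
        </VirtualHost>

        <VirtualHost *:80>
            ServerName cpanel.my-domain.com
            DocumentRoot /var/cpanel
        </VirtualHost>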

    Read the article

  • Speed-up large number of files deletion on NTFS volumes

    - by sharptooth
    Every now and then I need to delete a folder containing something like 500k files from an NTFS volume. I do this with Windows Explorer. Since NTFS journals all the metadata changes, each deletion is carried out serially, so deleting the whole 500k files takes ages. I remember that when I did the same on FAT32 it ran incomparably faster. Is there any way to speed up the deletion of a large number of files on NTFS volumes?
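
    Much of the delay comes from Explorer itself (per-file progress tracking and Recycle Bin handling) rather than from NTFS alone. A hedged command-line sketch that usually runs far faster; the folder path is an example:

        rem delete the files without Explorer's per-file overhead or the Recycle Bin
        del /f /s /q "D:\bigfolder"
        rem then remove the now-empty directory tree
        rmdir /s /q "D:\bigfolder"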

    Read the article

  • How to download batch of randomly named files

    - by TheLearner
    I need to download a bunch of files from a website, but I don't want to have to click on each file to add it to downloadall or whatever. The structure of the website is as follows: http://something.com/katalog/?get=Exclusive/group1/2012.09.03/ The directory has loads of randomly named files with .doc extensions. I can't use the batch feature because the files don't start or end with the same characters, e.g. 001...100. Any ideas?
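
    If the listing page links to the files, a hedged wget one-liner can grab everything with a .doc extension without naming any file explicitly:

        # -r follow links, -np don't ascend, -nd no local subfolders, -A keep only .doc files
        wget -r -np -nd -A "*.doc" "http://something.com/katalog/?get=Exclusive/group1/2012.09.03/"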

    Read the article

  • Apache directory access with virtual host

    - by alexeygaidamaka
    I have a virtual host with a configuration like the one below. When I try to get into foobar.com/dir and provide a valid username/password pair, I get a 403 Forbidden page instead of the directory contents. www.foobar.com/dir has 777 rights and .htpasswd is chmoded to 644, but I can't figure out why I still can't see the contents. Please give me a hint.

        ServerAdmin webmaster@localhost
        ServerName www.foobar.com
        ServerAlias www.foobar.com
        DocumentRoot /var/www/foobar
        <Directory />
            Options FollowSymLinks
            AllowOverride All
        </Directory>
        <Directory /var/www/foobar>
            Options -Indexes FollowSymLinks
            AllowOverride All
            Order allow,deny
            allow from all
        </Directory>
        ScriptAlias /cgi-bin/ /usr/lib/cgi-bin/
        <Directory "/usr/lib/cgi-bin">
            AllowOverride None
            Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch
            Order allow,deny
            Allow from all
        </Directory>
        <Directory /var/www/foobar/dir>
            AllowOverride AuthConfig
            AuthName "Authorize yourself, please!"
            AuthType Basic
            AuthUserFile /etc/apache2/.htpasswd
            AuthGroupFile /dev/null
            Allow from All
            Order Allow,Deny
            Require valid-user
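
    One thing worth checking, as a hedged guess: the parent block sets Options -Indexes, so even after successful authentication Apache returns 403 if /var/www/foobar/dir contains no index file and directory listings stay disabled. A minimal Apache 2.2 sketch that re-enables listings for that one directory (and closes the block):

        <Directory /var/www/foobar/dir>
            Options +Indexes
            AuthType Basic
            AuthName "Authorize yourself, please!"
            AuthUserFile /etc/apache2/.htpasswd
            Require valid-user
            Order allow,deny
            Allow from all
        </Directory>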

    Read the article

  • Program for keeping files encrypted

    - by Giorgi
    I am looking for a program that will encrypt files specified by me and let me view/edit/delete those files without creating a virtual disk. I do not want a virtual disk because a domain administrator can access it, so TrueCrypt is not a possibility. One option is to use WinRAR with a password-protected archive, but WinRAR serves a different goal, so it is not very user friendly for this purpose. If possible, it would be nice if the program did not create temp files while I have the files open. Any suggestions?
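
    One hedged per-file alternative is symmetric encryption with GnuPG; the file name is an example. Note that it only partly meets the no-temp-files requirement, since a decrypted copy does exist on disk while you edit it:

        # encrypt with a passphrase; produces secret.doc.gpg, the original can then be wiped
        gpg -c secret.doc
        # decrypt back to secret.doc when you need to view or edit it
        gpg -o secret.doc -d secret.doc.gpg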

    Read the article

  • Can not copy files from Windows 2003 server over network

    - by Mark
    This seems quite strange. I have a shared folder with full read/write permissions on my Windows 2003 server. From an XP client I can create a new folder in the share and copy files to it normally, but I cannot copy those files back to my client PC. I tried using FTP and WebDAV to get the files from the server; neither worked. Is the issue related to the NETWORK SERVICE account? Thanks for your help.

    Read the article

  • how to make VhdResizer work on XP Mode VHD files

    - by A_M
    Hi, I'm trying to shrink a Windows 7 XP Mode VHD file, with little success. I've been trying to use VhdResizer, but when I select my VHD file it says "VhdExpand only supports fixed and dynamic VHD files". My XP Mode VHDs are dynamic files. Does anyone have any idea why it is failing? Failing that, does anyone have a process I can use to shrink my XP Mode VHD files? Thanks.
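
    A hedged guess: Windows Virtual PC creates each XP Mode disk as a differencing VHD chained to the shared "Windows XP Mode base.vhd", and most resize tools reject differencing disks. You can confirm what kind of VHD it is with diskpart; the path below is an example:

        rem from an elevated command prompt
        diskpart
        rem then, inside diskpart:
        select vdisk file="C:\Users\AM\Virtual Machines\Windows XP Mode.vhd"
        detail vdisk

    If "detail vdisk" reports a Parent Filename, the file is a differencing disk rather than a dynamic one, which would explain why VhdResizer refuses it.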

    Read the article

  • Reducing memory for worker MPM in Apache

    - by ShyM
    I've moved from the prefork MPM to the worker MPM because of a process limit I was hitting on my VPS. However, memory usage increased after switching over, which is odd since the worker MPM is supposed to have a smaller memory footprint. Most of the memory belongs to php-cgi processes. Is there something I'm doing wrong? I have around 20 sites on the server, each with a different FastCGI wrapper script. Could that be the reason?
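
    If the sites run through mod_fcgid, the php-cgi memory is governed by how many processes are allowed to stay alive per wrapper class, not by the MPM. A hedged sketch of the relevant limits (directive names from mod_fcgid 2.3.6+; older releases call them MaxProcessCount and DefaultMaxClassProcessCount), with purely illustrative numbers:

        # global cap across all wrapper classes
        FcgidMaxProcesses 40
        # at most 2 php-cgi processes per site/wrapper
        FcgidMaxProcessesPerClass 2
        # reap idle php-cgi processes after a minute
        FcgidIdleTimeout 60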

    Read the article

  • retain last used path to location for saving files in Windows 7

    - by Mark Miller
    I am using Microsoft Office 2010 and Windows 7 on a Dell PC. I am opening a bunch of MS Word files one at a time, copying the data tables therein, pasting the data into Excel, and saving the Excel files as comma-delimited text files. I am creating a separate Excel file for each Word file. The path to the folder containing the saved comma-delimited files is quite long, something like this: c:\users\me\aa\bb\cc\dd\ee\ Every time I open Excel and save a new comma-delimited file I have to re-navigate the entire path (c:\users\me\aa\bb\cc\dd\ee). In the past Windows seemed to remember the last used path, saving a lot of tedious keystrokes. In fact, I think Windows did this for me as recently as last week, albeit on a different computer. Can I apply a setting somewhere in Windows asking it to offer the last used path as a default when saving files, so I do not have to re-navigate the entire directory structure for each new comma-delimited file? If so, where is the option for specifying that setting? Thank you for any help.

    Read the article

  • Dropbox picture sync: Skip RAW files?

    - by Steven Lu
    I like the convenience of having Dropbox keep track of my photos, because it tends to work with my devices over 3G (I am often tethering to my phone with my iPad and MacBook) as well as Wi-Fi, but it's a waste of network traffic to sync the raw files from my camera or memory card. They clutter up the Dropbox list and the files are just huge. Is there a way to configure the Dropbox client so that it ignores a certain file extension for the picture sync? Also, I suspect that if I just go and delete the raw files, the next time I plug in the memory card and tell Dropbox to sync, it will re-download them, which would be terrible. I could switch to iCloud Photo Stream, I suppose, but there would be no access via 3G that way, and I've already got years of experience with Dropbox, so I know it's going to just work. I think any method that works for excluding files from sync in Dropbox in general should work here too. Edit: Wow, there are 19k votes for this exact request.

    Read the article

  • can't use periods in ServerName/ServerAlias [Lion Apache installation]

    - by punchfacechamp
    I can access my host like this: http://keggyshop, but not with a period in the name: http://keggyshop.edu. Here is my virtual host directive:

        <VirtualHost *:80>
            ServerName keggyshop
            ServerAlias keggyshop.edu
            DocumentRoot "~/sites/2012/keggy/web/pages/keggy/120528/sandbox/public"
            <Directory "~/sites/2012/keggy/web/pages/keggy/120528/sandbox/public">
                Options Includes FollowSymLinks
                AllowOverride All
                Order allow,deny
                Allow from all
            </Directory>
        </VirtualHost>
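
    Apache happily accepts dots in ServerName/ServerAlias, so the failure is usually name resolution: "keggyshop" resolves (often via the local machine name), while "keggyshop.edu" looks like a real domain that your DNS cannot answer. A hedged fix on OS X Lion is a hosts entry; on some releases flushing the cache needs killall -HUP mDNSResponder instead:

        # /etc/hosts (edit with sudo): point the dotted name at the local machine
        127.0.0.1   keggyshop.edu www.keggyshop.edu
        # then flush the DNS cache
        sudo dscacheutil -flushcache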

    Read the article

  • Windows XP - recover document opened directly from IE

    - by Thingfish
    Hi. I am attempting to help a family member recover a document. The Word 2007 document was downloaded and opened directly from a webmail interface using Internet Explorer running on Windows XP. The user saved the document multiple times while working on it for the better part of a day. After closing Word 2007 the user was not able to locate the document, and I have so far not been able to help. The computer has not been turned off, and the user has not attempted to open the document directly from the mail again. Recreating the events on Vista/Windows 7, it is easy enough to locate the document under the Temporary Internet Files folder; I have, however, not been able to do the same on Windows XP. Any suggestions on how to locate this document, or whether it is even possible? Thanks
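
    A hedged place to start on XP: the IE cache lives under Local Settings (its Content.IE5 subfolders are hidden), and Word may also have left AutoRecover files behind. From a command prompt in the user's account:

        rem search the IE cache for Word documents, newest last
        dir /s /a /od "%USERPROFILE%\Local Settings\Temporary Internet Files\*.doc*"
        rem also look for Word AutoRecover files
        dir /s /a "%USERPROFILE%\Application Data\Microsoft\Word\*.asd"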

    Read the article

  • Reverse Proxy (mod_rewrite) and Rails (absolute paths)

    - by SooDesuNe
    I have a front-end Rails app that reverse proxies to any of a number of backend Rails apps depending on the URL. For example, http://www.my_host.com/app_one reverse proxies to http://www.remote_host_running_app_one.com, such that a URL like http://www.my_host.com/app_one/users will display the contents of http://www.remote_host_running_app_one.com/users. I have a large, and ever expanding, number of backends, so they cannot be explicitly listed anywhere other than a database. This is no problem for mod_rewrite using a prg: rewrite-map reverse proxy. The question is, the URLs returned by Rails helpers have the form /controller/action, making them absolute to the root. This is a problem for the page served via mod_rewrite, because links on the proxied page appear absolute to the domain. That is, http://www.my_host.com/app_one/controller/action has links that end up looking like /controller/action when they need to look like /app_one/controller/action. mod_proxy_html seems like the right idea, but it doesn't seem to be as dynamic as I would need, since the rules have to be hard-coded into the config files. Is there a way to fix this server-side, so that the links will be routed correctly?
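
    One hedged alternative is to fix it on the backend rather than in the proxy: each backend already knows its own public prefix, so telling Rails about it makes the URL helpers emit the prefixed paths. In Rails 2.x that is roughly the line below (the setting moved to config.relative_url_root / the RAILS_RELATIVE_URL_ROOT environment variable in later versions):

        # config/environment.rb of the app served under /app_one
        config.action_controller.relative_url_root = "/app_one"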

    Read the article

  • Apache httpd permissions

    - by DD.
    I have created a directory /xyz/www with the following permissions: -rw-r--r--. 1 myuser developers. I edited my httpd.conf:

        DocumentRoot "/xyz/www/"
        <Directory "/xyz/www/">
            Options Indexes FollowSymLinks
            AllowOverride None
            Order allow,deny
            Allow from all
        </Directory>

    I get a 403 error: "You don't have permission to access / on this server." Looking in the logs: (13)Permission denied: Can't open directory for index: /xyz/www/ I've tried recursively adding 777 permissions but I still have the same issue.
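
    Two hedged things to check: every directory component of the path (not just /xyz/www itself) needs the execute/search bit for the Apache user, and the trailing "." in the ls output suggests SELinux, which can deny access regardless of file modes. A sketch:

        # show the permissions of every component along the path
        namei -m /xyz/www
        # make sure the parent directories are searchable
        chmod o+x /xyz /xyz/www
        # on SELinux systems, label the tree as web content
        chcon -R -t httpd_sys_content_t /xyz/www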

    Read the article

  • virtual host settings fail on multiple sites

    - by Ricalsin
    Wow, I'm puzzled. On my Ubuntu system I've set up an apache2 server and configured three virtual hosts in the /etc/apache2/sites-available directory, then used a2ensite to create the symbolic links in sites-enabled. The first two work great; a simple URL of localhost.mysitenames.com works for each of the first two sites, and both find their DocumentRoot and Directory paths. The third always generates a "Bad Request (Invalid Hostname)" response. Nothing appears in the server error.log, as the request never hits it. I've copied and pasted the working vhost files, made the minor changes to the ServerName, DocumentRoot and Directory, and the same problem persists. I always run "sudo /etc/init.d/apache2 restart" whenever I make a change, and I've cleared the browser cache as well. No love. There isn't a limit to the number of sites you can host, right? My goal is a localhost development environment, with the expectation that I can run any number of websites locally before pushing them to a live server. Any thoughts on how to debug this? Or just a simple solution I am missing?
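
    Since nothing reaches Apache's error log, the request is most likely not arriving at this Apache at all: the third hostname may be resolving somewhere else, or another service is answering it. Two hedged checks; the third site's name below is a placeholder:

        # dump the parsed virtual host layout and any NameVirtualHost warnings
        sudo apache2ctl -S
        # confirm the third hostname resolves to this machine (add it to /etc/hosts if not)
        getent hosts localhost.thirdsite.com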

    Read the article

  • Apache Virtualhost entry with Windows hostname

    - by gshauger
    I have a Windows domain controller and we use it for DNS on our internal network. I have an Ubuntu box with an IP address of 172.16.34.149. Within the Windows DNS I created the forward and reverse lookup entries for the name Endymion. Naturally, whenever I FTP/SSH/HTTP/etc to the hostname Endymion it resolves correctly to my Ubuntu box. I wanted to do some web development on this box for an existing site. There were problems when I placed the website in a subfolder of /var/www/; let's just say it was in the folder /var/www/projectx/. The issue involved the incorrect resolution of non-relative URLs. So I figured I could create a new DNS entry for the hostname projectx. Sure enough, when I FTP/SSH/HTTP/etc to the hostname projectx it takes me to the same Ubuntu box as the hostname Endymion, which is what I would expect. I now have two hostnames for the same box. I then created a VirtualHost entry in httpd.conf that looks like the following:

        <VirtualHost *:80>
            DocumentRoot /var/www/projectx
            ServerName projectx
            ServerAlias projectx
        </VirtualHost>

    Sure enough, when I go to a browser and type in http://projectx/ it takes me to the correct subfolder. Everything works! Not so fast. I then go to http://endymion/ and instead of taking me to /var/www/ it takes me to /var/www/projectx/. Clearly I'm missing something. Help please! ;)
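
    This is standard name-based virtual host behaviour: a request whose Host header matches no ServerName is served by the first virtual host, and once one vhost exists the main server's DocumentRoot no longer answers. A hedged fix is to give Endymion its own vhost and list it first (Apache 2.2 syntax):

        NameVirtualHost *:80

        # the first vhost is also the default for unmatched hostnames
        <VirtualHost *:80>
            ServerName endymion
            DocumentRoot /var/www
        </VirtualHost>

        <VirtualHost *:80>
            ServerName projectx
            DocumentRoot /var/www/projectx
        </VirtualHost>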

    Read the article

  • Apache httpd VirtualHost config - multiple sites

    - by DaFoot
    [Advised to post here from StackExchange] I have a site to work on; because of the way the URLs are built, the application seems to have been created on the assumption that it will be at the server root (the only app). On my dev server I have other projects, and up to now a simple symlink has been working for me, but that is not the case now because this new app wants to sit at the root and process all URLs arriving on :80. Hopefully this snippet from httpd.conf will help explain what I'm trying to achieve:

        # default for any not matched elsewhere
        <VirtualHost *:80>
            ServerName localhost
            DocumentRoot /var/www/html/newproject
        </VirtualHost>

        # now try to pick out specific URLs
        <VirtualHost localhost/webdev>
            DocumentRoot /var/www/html/existingProject
            ServerName localhost/project
        </VirtualHost>

    I also need to be able to get the same effect from wherever I access the httpd instance. Hope that makes sense.
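
    A VirtualHost can only match on address/port and hostname, never on a path, which is why the second block cannot work. A hedged sketch that keeps the new app at the root and mounts the existing project under a path with Alias (Apache 2.2 directives; adjust paths as needed):

        <VirtualHost *:80>
            ServerName localhost
            DocumentRoot /var/www/html/newproject

            # existing project reachable at http://<any-host>/webdev
            Alias /webdev /var/www/html/existingProject
            <Directory /var/www/html/existingProject>
                Order allow,deny
                Allow from all
            </Directory>
        </VirtualHost>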

    Read the article

  • Mercurial (hg) commit only certain files

    - by bresc
    Hi, I'm trying to commit only certain files with hg. Because hg seems to auto-add, whenever I try to commit a change it wants to commit all files, but I don't want that because certain files are not "ready" yet. There is hg commit -I thefile.foo, but that is only for one file. The better way for me would be to turn off auto-add, as in git. Is this possible? Thanks
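
    For what it's worth, hg does not add new files by itself; by default it commits every modified tracked file. A hedged sketch of limiting a commit to chosen files (the paths and patterns are examples); -I may also be given several times:

        # commit only the named files; everything else stays modified but uncommitted
        hg commit -m "partial work" path/to/file1 path/to/file2
        # or select by pattern, repeating -I as needed
        hg commit -m "partial work" -I "glob:lib/*.py" -I docs/readme.txt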

    Read the article

  • git - recover deleted files from a prior commit

    - by Walter White
    I accidentally deleted some files in a prior commit and would like to recover them. How can I do this? I ran this and found exactly what I was looking for: git whatchanged --diff-filter=D. At the time I made the commit, I should have committed only the new/changed files and then run a reset --hard to recover the missing files. I have about 100 files that I need to restore. I don't want to do a straight revert, as that would also undo the other changes in that commit. Any ideas?
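
    Since the files still exist in that commit's parent, they can be checked out from there without reverting anything else. A hedged sketch, where <commit> stands for the hash found with whatchanged:

        # list the files deleted in that commit
        git show --diff-filter=D --name-only --pretty=format: <commit>
        # restore one of them from the commit's parent into the index and working tree
        git checkout <commit>^ -- path/to/deleted_file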

    Read the article

  • How to troubleshoot this memory usage?

    - by Camran
    I have a classifieds website. I use PHP, MySQL, and Solr. Solr runs in a servlet container, in my case Jetty, which is a Java application. I just noticed that something was terribly wrong on my website: I opened the terminal, ran the "top" command, and noticed that Java was eating all the CPU and memory. I thought, "OK, maybe I need more memory and CPU," so I increased them, but along with the increase the Java app started eating more. This has never happened before, and it is either a bug or a hack of some kind. Anyway, I need to troubleshoot this now, so how do I do that? Can I somehow pinpoint from some error log exactly when the memory usage started to go up? How does one troubleshoot this, and how do I prevent it? Is it possible to prevent too many requests somehow, if they arrive within a short time window? Thanks
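
    A hedged first pass with the JDK's own tools: find the Jetty process, watch its garbage-collection behaviour, and see what is filling the heap; capping the heap keeps it from growing without bound (the 512m figure is just an example):

        # find the Jetty/Solr process id
        ps aux | grep jetty
        # sample heap occupancy and GC activity every 5 seconds
        jstat -gcutil <pid> 5000
        # histogram of the objects occupying the heap
        jmap -histo <pid> | head -40
        # restart Jetty with an explicit heap ceiling
        java -Xmx512m -jar start.jar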

    Read the article

  • Sorting Files into Subfolders based on EXIF Date

    - by honestor
    I have a huge directory from an HDD recovery that contains 70,000+ JPEG files. I tried playing around with some AppleScripts that I found, but had no luck. I have already installed ExifTool, which might be useful for this task. The current directory structure is as follows:

        dir001
        - file0001.jpg
        ...
        - file9999.jpg
        dir002
        - file0001.jpg
        ...
        - file9999.jpg
        ...
        dir070
        - file0001.jpg
        - ...
        - file9999.jpg

    The files mostly have EXIF data, but some files have no metadata. Now I hope to be able to sort and rename these files into folders based on the date:

        1999
        - 1999 01 31
        - 1999_01_31_-_22_59_59.jpg
        2000
        - 2000 05 20
        - 2000_05_20_-_21_59_59.jpg
        - 2000_05_20_-_22_59_59.jpg

    I figured AppleScript/Automator might come in handy for this; however, any other solution would be welcome too!
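
    Since ExifTool is already installed, its rename-into-date-directories feature can do the whole job in one pass. A hedged sketch matching the target layout above; files without a DateTimeOriginal tag are simply left where they are:

        exiftool -r -d "%Y/%Y %m %d/%Y_%m_%d_-_%H_%M_%S.%%e" "-FileName<DateTimeOriginal" dir001 dir002 dir070

    If two photos share the same second, adding a copy-number token such as %%-c to the date format avoids filename collisions.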

    Read the article

  • converting apache rewrite rules to nginx

    - by Muktadir Miah
    Hello everyone. I am trying to create a UDID-protected Cydia repo, but I cannot use it on nginx because nginx does not read .htaccess files. The file contains the rewrite rules needed to make the repo run. Here is a copy of the repo: https://github.com/ic0nic/UDID-repo Below is a copy of the .htaccess file:

        RewriteEngine On
        RewriteBase /your_repo_folder/
        RewriteRule ^(Release)$ release.php
        RewriteRule ^(Packages.*)$ package.php
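
    A hedged nginx translation of those two rules, assuming the repo also lives under /your_repo_folder/ on the nginx server:

        location /your_repo_folder/ {
            rewrite ^/your_repo_folder/Release$ /your_repo_folder/release.php last;
            rewrite ^/your_repo_folder/(Packages.*)$ /your_repo_folder/package.php last;
        }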

    Read the article

  • Apache Ubuntu SSL Configuration

    - by JSP
    Where, besides the vhost configuration, can SSL be configured? I see an SSL configuration in sites-available, but it is not an enabled vhost (and the certificate it points to is expired). Running apache2 -V shows me the configuration directory is /etc/apache2, but I cannot for the life of me find the SSL configuration and it's driving me crazy. Any suggestions on where to look or what I'm missing? Ubuntu 12: Linux ip-10-39-119-18 3.2.0-23-virtual #36-Ubuntu SMP Tue Apr 10 22:29:03 UTC 2012 x86_64 x86_64 x86_64 GNU/Linux
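
    A hedged way to track down every place SSL is switched on in a Debian/Ubuntu Apache layout:

        # find every SSL directive Apache could be reading
        grep -REi "SSLEngine|SSLCertificateFile" /etc/apache2
        # show which vhosts (including *:443) are actually active
        sudo apache2ctl -S
        # check whether the ssl module and any ssl site are enabled
        ls /etc/apache2/mods-enabled | grep -i ssl
        ls /etc/apache2/sites-enabled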

    Read the article

  • Shares not working on boot; need to reinstall "File and Printer Sharing for Microsoft Networks" on the server every morning to fix

    - by Neaox
    I had a problem a few days ago; see my question here: Can no longer access computer or network shares to my server from any other computers on the network. The fix that I found does in fact work; however, when I boot my PC in the morning the shares are no longer working. To fix this I need to remote desktop into the server and reinstall "File and Printer Sharing for Microsoft Networks" on the main adapter. Doing this makes the shares work again, but it is annoying to have to do it each and every morning. On top of this, my offline files are no longer available offline: I store my user profile on the server and had the files set to "Always Available", but since this happened they are no longer available offline, and the option to make them available offline is missing from the context menu. Another problem, and I don't know if this is the cause or just a symptom of a deeper issue: this server runs Hyper-V, and since these problems started I can no longer remote desktop into the Hyper-V client. Thanks for any help anyone can give.

    Read the article
