Search Results

Search found 40479 results on 1620 pages for 'binary files'.

Page 111/1620 | < Previous Page | 107 108 109 110 111 112 113 114 115 116 117 118  | Next Page >

  • Files and folders disappeared from my desktop

    - by Rob
    Suddenly, all the files, folders and icons that were on my desktop have disappeared. I tried enabling "show hidden files", with no luck. I tried using recovery software, but it did not find any of these files, so I assume they were not deleted. C:/Users/<myname>/Desktop looks empty from Explorer as well as from cmd. I installed MalwareBytes; it did find and remove some malware, but that didn't seem to help. I used RogueKiller and it found some suspicious registry entries related to "HideDesktopIcons":

        ¤¤¤ Registry Entries : 4 ¤¤¤
        [HJ DESK] HKCU\[...]\ClassicStartMenu : {59031A47-3F72-44A7-89C5-5595FE6B30EE} (1) -> REPLACED (0)
        [HJ DESK] HKCU\[...]\NewStartPanel : {59031A47-3F72-44A7-89C5-5595FE6B30EE} (1) -> REPLACED (0)
        [HJ DESK] HKCU\[...]\ClassicStartMenu : {20D04FE0-3AEA-1069-A2D8-08002B30309D} (1) -> REPLACED (0)
        [HJ DESK] HKCU\[...]\NewStartPanel : {20D04FE0-3AEA-1069-A2D8-08002B30309D} (1) -> REPLACED (0)

    I deleted these registry entries and rebooted, but that only unhid the My Computer and user icons; my desktop files are still missing. Any ideas what I should do next?

    Read the article

  • Can't access windows 7 shared files on Ubuntu 11.10

    - by Corey
    I just set up Ubuntu 11.10 and Samba. I got it to access shares on a Vista machine, but when I try to access the shares on a Windows 7 machine it asks for a username, domain, and password. I have no password set up on the Windows 7 machine, so I put in the username and the domain and try to connect, but the password prompt keeps reappearing. I also tried guest and admin with no luck. I've tried many different fixes (modifying registry entries and advanced security settings on the Windows 7 machine) with no luck. Thanks
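
    One way to narrow this down (not part of the original question) is to test the Windows 7 share from the Ubuntu command line with smbclient, whose error messages are easier to read than the looping GUI prompt; WIN7PC, myuser and ShareName are placeholders.

        smbclient -L //WIN7PC -U myuser          # list the shares this account can see
        smbclient //WIN7PC/ShareName -U myuser   # open one share interactively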

    Read the article

  • Swap files in Cloud Infrastructures

    - by ffeldhaus
    At our company we have set up an OpenStack cloud and are currently writing internal guidelines for the creation of OS templates/images. One controversial topic was whether we should provide swap inside the VM templates, so I'd like to ask the following questions. From an elastic cloud provider's point of view, does it make sense to offer swap partitions/files in the VM templates, or is swap unnecessary when a VM can simply be resized? Which scenarios actually demand that a swap file be present? What kind of storage should be used for swap files (e.g. local or central, FC/iSCSI/NFS)? Are there any best practices for offering swap files in a performant way in cloud infrastructures?
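
    For reference, a minimal sketch (not from the original question) of the alternative to baking swap into the template: adding a swap file inside an already-running guest on its local disk. Size and path are assumptions.

        sudo fallocate -l 2G /swapfile     # or: dd if=/dev/zero of=/swapfile bs=1M count=2048
        sudo chmod 600 /swapfile
        sudo mkswap /swapfile
        sudo swapon /swapfile
        echo '/swapfile none swap sw 0 0' | sudo tee -a /etc/fstab   # make it persistent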

    Read the article

  • Desktop directory disappears in gnome-terminal, then appears again, but all files in it are deleted

    - by Ingen
    I am able to see my Desktop with all its various links and files, but in the terminal, when I try to access the Desktop directory with cd ~/Desktop, I get: bash: cd: /home/administrator/Desktop: No such file or directory. Then I find I am unable to open any of the files on the Desktop when I click on them, although the file icons are there. Then the icons disappear after I click on them. After that I am able to access the Desktop directory in the terminal, but the directory is empty, i.e. all the files/folders have been deleted. What's going on? How can I fix this?
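
    Some first diagnostics that may help pin this down (not from the original question): check where the desktop directory is currently mapped and whether anything by that name still exists in the home directory.

        xdg-user-dir DESKTOP                           # the path the desktop is mapped to
        cat ~/.config/user-dirs.dirs                   # XDG user-directory configuration
        ls -la /home/administrator | grep -i desktop   # is anything named Desktop still there?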

    Read the article

  • How to block access to files in the current directory with .htaccess

    - by kfir
    I have a few private files in a public folder and I want to block access to them. For example, let's say I have the following file tree:

        DictA
            FileA
        FileA
        FileB
        FileC

    I want to block access to FileB and FileA in the current directory and allow access to the FileA in the DictA directory. The first thing that came to mind was to use the FilesMatch directive as follows:

        <FilesMatch "^(?:FileA)|(?:FileB)$">
            Deny from all
        </FilesMatch>

    The problem here is that FileA inside DictA will also be blocked, which is not what I wanted. I could override that by adding another .htaccess file to DictA, but I would like to know if there is a solution which won't involve that. P.S.: I can't move the private files to a separate folder.
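
    One possible approach (a sketch, not a confirmed answer, and it assumes mod_rewrite is enabled): per-directory FilesMatch rules are inherited by subdirectories, but a RewriteRule pattern in a .htaccess is matched against the path relative to the directory containing that .htaccess, so it will not match DictA/FileA.

        RewriteEngine On
        # Only FileA and FileB directly in this directory match; DictA/FileA does not.
        RewriteRule ^(FileA|FileB)$ - [F,L]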

    Read the article

  • How to add recently set cookies to nginx's access log

    - by etoleb
    I'd like to include cookie data in an nginx access log like so (simplified example):

        log_format foo '$remote_addr "$request" $cookie_bar';
        access_log /var/log/nginx/access.log foo;

    This works great on requests that already have a cookie "bar", but for the first request to my server nginx will report "-" as the value of "bar". It seems like my problem is that nginx is looking at the request headers for the cookie value. Is there a way to check for a Set-Cookie header in the response and use that as a fallback?
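
    A possible workaround (a sketch, not tested against this setup): nginx exposes response headers as $sent_http_* variables, so a map in the http block can fall back to the response's Set-Cookie header when the request carried no bar cookie. Note that $sent_http_set_cookie holds the whole header value, not just bar.

        map $cookie_bar $log_bar {
            ''      $sent_http_set_cookie;   # first request: take it from the response
            default $cookie_bar;             # later requests: cookie sent by the client
        }
        log_format foo '$remote_addr "$request" $log_bar';
        access_log /var/log/nginx/access.log foo;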

    Read the article

  • Copy files with original folder structure, but to 8.3 format

    - by kokbira
    I have a folder with a lot of files and folders inside it. I would like to copy it to another location so that the result is a folder with the same file and folder structure, but with all files in 8.3 format. How can I do it? PS: Some files have extensions with more than 3 characters (e.g. home.sh3d, windows.theme, etc.), so when I talk about transforming all filenames to 8.3 I really mean transforming them to an 8.X format (i.e. without changing the extensions).
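
    A rough PowerShell sketch of the kind of script this needs (paths are placeholders; it keeps extensions as asked, but does not resolve collisions when two long names truncate to the same 8 characters):

        $src = 'C:\source'        # assumed source folder
        $dst = 'C:\target'        # assumed destination folder
        Get-ChildItem -Path $src -Recurse -File | ForEach-Object {
            $relDir = $_.DirectoryName.Substring($src.Length).TrimStart('\')
            $outDir = if ($relDir) { Join-Path $dst $relDir } else { $dst }
            New-Item -ItemType Directory -Force -Path $outDir | Out-Null
            $base = $_.BaseName
            if ($base.Length -gt 8) { $base = $base.Substring(0, 8) }
            Copy-Item -Path $_.FullName -Destination (Join-Path $outDir ($base + $_.Extension))
        }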

    Read the article

  • Saving small text files is slow over Win Server 2008 R2 VPN

    - by Buckers
    We have a VPN connection to our Windows Server 2008 R2 machine, and the connection works fine. Large files go back and forth fairly quickly, but we use the connection mainly for working on small text files (.aspx, .asp, .php etc.). What we find very annoying is that even for the smallest of files there is a noticeable delay of 2-5 seconds when saving any changes. As we often make changes to code and are constantly saving, this is becoming a problem. Is there anything that might be causing this delay, or anything we can do to speed it up? The connection itself is definitely not the issue, as we have a constant 5Mb upload from our server and 20Mb+ down on the remote machines. Thanks, Chris.

    Read the article

  • Unix/Linux simple log parser (since, until)

    - by dpb
    Has anyone ever used/created a simple Unix/Linux log parser that can parse logs of the form "timestamp log_message\n"? It should order the messages, parse the timestamp, and return: all messages; messages after a certain date (--since); messages before a certain date (--until); or a combination of --since and --until. I could write something like this, but wasn't sure if there was something canned. It would fit well in some automated reporting I'm planning on doing.
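
    Nothing canned comes to mind, but a minimal sketch of the filtering part is only a few lines, assuming sortable timestamps such as ISO-8601 (2012-05-01T13:00:00) in the first field:

        #!/bin/bash
        # Usage: ./logrange.sh logfile [--since TS] [--until TS]
        since_ts="0000"; until_ts="9999"
        file="$1"; shift
        while [ $# -gt 0 ]; do
            case "$1" in
                --since) since_ts="$2"; shift 2 ;;
                --until) until_ts="$2"; shift 2 ;;
                *) shift ;;
            esac
        done
        # sort orders the messages; awk keeps only lines inside the window
        sort "$file" | awk -v s="$since_ts" -v u="$until_ts" '$1 >= s && $1 <= u'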

    Read the article

  • PHP session files have permissions of 000 - They're unusable

    - by vanced
    I kept having issues with a Document Management System I'm trying to install: at the first step of the installation process it errors with:

        Warning: Unknown: open(/tmp/sess_d39cac7f80834b2ee069d0c867ac169c, O_RDWR) failed: Permission denied (13) in Unknown on line 0
        Warning: Unknown: Failed to write session data (files). Please verify that the current setting of session.save_path is correct (/tmp) in Unknown on line 0

    I looked in /tmp and saw that the sess_* files have the following permissions:

        ---------- 1 vanced vanced 1240 Jan 20 08:48 sess_d39cac7f80834b2ee069d0c867ac169c

    All the session files look like this, so obviously they're unusable by PHP, and it's causing me lots of problems. How can I get PHP to set the correct permissions? I've tried changing the directory which php.ini uses to /tmp/phpsessions, and the same thing occurs. The directories are a+rwx.
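
    One knob worth checking (a sketch of a possible workaround, not a confirmed fix): for the files session handler, session.save_path accepts an optional mode argument in the "N;MODE;/path" form, which tells PHP what permissions to create session files with. The depth value of 0 and the path below are assumptions.

        session.save_handler = files
        ; 0 = directory depth (assumed here), 0660 = mode for newly created session files
        session.save_path = "0;0660;/tmp/phpsessions"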

    Read the article

  • Hi, I have a problem with Windows 7, so I used Ubuntu to get my files back, but when I try to open the files I get this message:

    - by user286972
    Error mounting /dev/sda1 at /media/ubuntu/B800C0C300C08A38: Command-line `mount -t "ntfs" -o "uhelper=udisks2,nodev,nosuid,uid=999,gid=999,dmask=0077,fmask=0177" "/dev/sda1" "/media/ubuntu/B800C0C300C08A38"' exited with non-zero exit status 13:

        ntfs_attr_pread_i: ntfs_pread failed: Input/output error
        Failed to read NTFS $Bitmap: Input/output error

    NTFS is either inconsistent, or there is a hardware fault, or it's a SoftRAID/FakeRAID hardware. In the first case run chkdsk /f on Windows then reboot into Windows twice. The usage of the /f parameter is very important! If the device is a SoftRAID/FakeRAID then first activate it and mount a different device under the /dev/mapper/ directory, (e.g. /dev/mapper/nvidia_eahaabcc1). Please see the 'dmraid' documentation for more details.
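
    Given the Input/output errors in that message, the disk itself may be failing, so a cautious sequence (a sketch, not from the original post) is to try a read-only mount and, if the data matters, image the partition before doing anything else:

        sudo mount -t ntfs-3g -o ro /dev/sda1 /mnt       # read-only mount attempt
        # If that still fails, clone the partition and recover from the copy.
        # ddrescue is in the gddrescue package; the image path is a placeholder.
        sudo ddrescue -n /dev/sda1 /media/backup/sda1.img /media/backup/sda1.log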

    Read the article

  • svn project with linked common files

    - by Eric
    The src directory of my project is composed of three folders: two sub-projects and some common files. I linked the files of the common directory into the two sub-projects. I've just imported my project into svn but ended up with three copies of the content of the common directory. I'm wondering if svn can deal with this, and how; for example, an option which tells it not to follow links. I've thought about deleting the linked files from the sub-projects in svn. Thank you, Éric.
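
    One svn-native way to avoid the duplication (a sketch, assuming Subversion 1.5+ for the ^/ relative URL syntax and a conventional layout with the shared files at ^/trunk/common): reference the common directory from each sub-project with svn:externals instead of symlinks.

        svn propset svn:externals '^/trunk/common common' subprojectA
        svn propset svn:externals '^/trunk/common common' subprojectB
        svn commit -m "Reference common/ via svn:externals instead of duplicating it"
        svn update    # externals are fetched on update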

    Read the article

  • LogFormat for capturing the requested hostname in a *.domain.com scenario?

    - by Dhiraj Gupta
    I have an Apache 2.2 VirtualHost with a ServerName of *.domain.com. This is required for my scenario: all subdomains are handled by the same site. Now, in the access log, I am trying to figure out a LogFormat variable (or another way) that will let me log the domain name that was actually requested. If I use the vhost_combined format, all I get in my access log is *.domain.com entries, not the actual vhost that was asked for. Does anyone know how to do this?
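
    A sketch of one option: %v (used by vhost_combined) logs the canonical ServerName, while %{Host}i logs the Host header the client actually sent, so a custom LogFormat along these lines records the requested subdomain. The log path is a placeholder.

        LogFormat "%{Host}i %h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\"" host_combined
        CustomLog /var/log/apache2/access.log host_combined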

    Read the article

  • How can I easily confirm in Linux that two separate directories have the exact same contents?

    - by Mike B
    CentOS 5.x. My question seemed similar to this one, but I wasn't sure... I have two servers (completely isolated from each other), each with a directory and sub-directories that should have the exact same contents. For example the directory layout could be something like:

        SERVER A -
        /opt/foo/foob/1092380298309128301283/123.txt
        /opt/foo/foob/5094380298309128301283/456.txt
        /opt/foo/foob/5092380298309128301283/789.txt
        /opt/foo/foob/1592380298309128301283/abc.txt

        SERVER B -
        /opt/foo/foob/1092380298309128301283/123.txt
        /opt/foo/foob/5094380298309128301283/456.txt
        /opt/foo/foob/5092380298309128301283/789.txt
        /opt/foo/foob/1592380298309128301283/abc.txt

    Ideally I'd like a way to do a recursive check and have something confirm that everything matches. I also want to avoid using any third-party tools. Any ideas?
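
    A sketch using only stock tools: build a sorted checksum manifest on each server, carry one manifest across however the isolation allows, and diff the two.

        cd /opt/foo/foob
        find . -type f -print0 | sort -z | xargs -0 md5sum > /tmp/manifest-a.txt
        # repeat on server B (writing manifest-b.txt), copy it over, then:
        diff /tmp/manifest-a.txt /tmp/manifest-b.txt && echo "contents match"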

    Read the article

  • Search For a Query in RDL Files with PowerShell

    - by AllenMWhite
    In tracking down poorly performing queries for clients, I often encounter the query text in a trace file I've captured but don't know the source of the query. I've found that many of the poorest-performing queries are those written into the reports the business users need to make their decisions. If I can't figure out where they came from, usually years after the queries were written, I can't fix them. The first thing I did was find a great utility called RSScripter, which opens up a Windows dialog...(read more)
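
    For the simple case where the extracted .rdl files are already sitting in a folder, the search itself can be a PowerShell one-liner (a sketch; the folder and the query fragment are placeholders):

        Get-ChildItem -Path C:\Reports -Recurse -Filter *.rdl |
            Select-String -SimpleMatch -Pattern 'FROM dbo.SomeTable' |
            Select-Object Path, LineNumber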

    Read the article

  • Daemon for moving files between partitions?

    - by RATHI
    I have a system with Ubuntu installed on 20 GB and Windows on 100 GB, with two partitions of 100 GB each using NTFS. While using DC++ (downloading multiple big files) I used to get a message that the system was running out of space. Is there any way to make a daemon which keeps checking the Ubuntu partition, so that if its used space goes above a certain amount (let's say 18 GB) it automatically starts moving files from this drive to another drive (let's assume it picks files from the movie folder, or the largest media file on the drive)? Or it could prompt the user to ask which file to move. Is there any program which can do this for me? If not, can you suggest something to read so that I could write it myself?
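
    There may well be a ready-made tool, but a cron-driven sketch is short. All paths and the threshold are assumptions, and it moves only the single largest file per run:

        #!/bin/bash
        # Run from cron, e.g. every 10 minutes: */10 * * * * /home/user/movebig.sh
        THRESHOLD=90                   # percent used on / that triggers a move
        SRC="$HOME/Downloads"          # where DC++ drops finished files
        DEST="/media/windows-data"     # the NTFS data partition, already mounted
        used=$(df -P / | awk 'NR==2 {sub(/%/, "", $5); print $5}')
        if [ "$used" -ge "$THRESHOLD" ]; then
            biggest=$(find "$SRC" -type f -printf '%s\t%p\n' | sort -nr | head -n 1 | cut -f2-)
            [ -n "$biggest" ] && mv -v "$biggest" "$DEST/"
        fi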

    Read the article

  • Recovered video files won't play

    - by BioGeek
    I have an SD card with pictures and video which malfunctioned. I was able to recover the files with PhotoRec. The pictures are OK, but when I try to open the video files (*.mov extension) I get the following errors in the following programs. Windows Media Player: "Windows Media Player encountered a problem while playing the file". QuickTime: "Error -2048: Couldn't open the file because it is not a file that QuickTime understands". VLC: it shows the first frame of the video and the sound is just white noise. The file sizes look correct, so I presume the data is still in there. Is there any way to fix these recovered video files?
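
    A couple of diagnostic steps worth trying (a sketch, not a guaranteed fix): see what ffprobe can still read from a recovered clip, and try remuxing the streams into a fresh container. If the recovery lost the .mov index (the moov atom), remuxing will not help and a dedicated repair tool is needed.

        ffprobe recovered.mov            # which streams, if any, are still readable?
        ffmpeg -err_detect ignore_err -i recovered.mov -c copy remuxed.mov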

    Read the article

  • Extract photo stills from .vob files

    - by Eric Rath
    My parents had all the family slides scanned by a photo lab. The lab returned the digital photos on two DVDs as movies; there's some stock music over a slideshow with fades between each photo. The discs contain only a handful of files, including some very large VOB files. I'd like to extract these photos and import them into iPhoto. I saw this answer about capturing stills, and that might work if I can figure out the right offset from the beginning and the right capture rate, but this approach seems very error-prone for this purpose. Is there a better way? I wish the individual photo files were stored in a directory on the discs, but they're not there.
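
    A less error-prone alternative to guessing a fixed offset and capture rate (a sketch; the VOB name and the 0.3 threshold are assumptions to tune): let ffmpeg's scene-change detector emit one frame per slide, then weed out any duplicates the fades produce.

        ffmpeg -i VTS_01_1.VOB -vf "select='gt(scene,0.3)'" -vsync vfr slide_%04d.png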

    Read the article

  • Finding .desktop files based on their titles?

    - by stwissel
    That's part 2 of a question asked earlier (split up so I can give credit to the answers individually). When I type into the Dash, applications show up with their titles (also when hovering over the launcher); how can I find the associated .desktop file? When I look in the usual suspect locations (/usr/share/applications and ~/.local/share/applications) with Nautilus, I see the titles but not the file names (not even in Properties, which sucks). When I look from the command line I see the file names but not the titles (a switch would be nice). How can I get a listing (a custom column?) that shows them next to each other?
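
    From the command line, the titles live in the Name= lines of the .desktop files themselves, so a simple sketch that prints each file name next to its title is:

        grep -H '^Name=' ~/.local/share/applications/*.desktop /usr/share/applications/*.desktop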

    Read the article

  • Something keeps deleting my downloaded files

    - by corroded
    I have been using uTorrent for years, and recently I was surprised to see that I had 24 GB free. I thought that was because I had deleted some unused apps, but after a while I noticed my Torrents folder was gone (I put finished torrents in my Downloads/Torrents folder). I thought I had accidentally deleted it (I use rm -r to delete huge files), so I shrugged and started downloading those 24 GB back (after banging my head at the sheer stupidity). This morning, I noticed that my Torrents folder was gone again! This makes me think that something MUST be deleting my torrent files. I am not sure, but my hunch is uTorrent (so I just upgraded it) or something else entirely. This is getting frustrating, so I hope someone can help me with this. My only guess is that when I do Cmd+W (I'm on a Mac, OS X Lion), it closes the window and somehow deletes the torrents? I am downloading the files again now and will try to document what I do tomorrow so I can add more input here.
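
    One way to catch the culprit in the act (a sketch using a tool that ships with OS X): leave fs_usage watching filesystem activity on the folder and see which process touches it when the files vanish.

        sudo fs_usage -w -f filesys | grep -i torrents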

    Read the article

  • How can I upload a large number of files to Rackspace Cloud Files quickly?

    - by andy kim
    I have a lot of image files, about a million in a single directory, that I want to upload to Rackspace Cloud Files in the fastest and most efficient way. The python-cloudfiles upload script I'm using is very slow, and I'd like to know about different approaches or Python script code, because uploading one file per connection is very slow. I thought tarring the directory and uncompressing it on the other side would be a better way, but Cloud Files does not support that. Does anyone know any other way?
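
    The biggest win is usually parallelism rather than a different call pattern, since each sequential PUT mostly waits on the network. A rough sketch with the legacy python-cloudfiles library the question mentions (credentials, container and directory are placeholders):

        import os
        import cloudfiles
        from multiprocessing import Pool

        USERNAME, API_KEY = "myuser", "myapikey"        # placeholders
        CONTAINER, SRC_DIR = "images", "/data/images"   # placeholders

        def init_worker():
            # one connection per worker process, reused for all its uploads
            global container
            conn = cloudfiles.get_connection(USERNAME, API_KEY)
            container = conn.get_container(CONTAINER)

        def upload(name):
            obj = container.create_object(name)
            obj.load_from_filename(os.path.join(SRC_DIR, name))
            return name

        if __name__ == "__main__":
            files = os.listdir(SRC_DIR)
            pool = Pool(20, initializer=init_worker)    # 20 parallel uploads; tune to taste
            for _ in pool.imap_unordered(upload, files):
                pass
            pool.close()
            pool.join()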

    Read the article

< Previous Page | 107 108 109 110 111 112 113 114 115 116 117 118  | Next Page >