Search Results

Search found 74849 results on 2994 pages for 'file folder'.


  • How can I download a copy of an S3 public data set?

    - by tripleee
    I was naively assuming I could do something like

        s3cmd sync s3://snap-d203feb5 /var/tmp/copy

    but I seem to have the wrong idea of how to go about this. I cannot even get a simple thing to work:

        vnix$ s3cmd ls s3://snap-d203feb5
        Bucket 'snap-d203feb5': ERROR: Bucket 'snap-d203feb5' does not exist

    I guess the identifier I have is not for a "bucket" but for a "public data set". How do I go from one to the other? Do I have to start up an EC2 instance and create a bucket for this? How? The instructions at http://docs.amazonwebservices.com/AWSEC2/latest/UserGuide/using-public-data-sets.html seem to assume I want to use the data in an EC2 instance, but in this case I'd just like to browse a bit, at least for a start. By the by, copy/pasting the "US Snapshot ID" causes a nasty traceback from Python; they publish the ID with a weird Unicode (I presume) dash which cannot be copy/pasted directly. Am I making a mistake when I copy it? And what's the significance of "US" in there? Can't I use the data outside North America?
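
    A snap-… identifier is an EBS snapshot ID rather than an S3 bucket name, which would explain why s3cmd cannot list it. A minimal sketch of attaching the snapshot from an EC2 instance follows, assuming the AWS CLI is configured; the availability zone, volume ID, instance ID, device name, and mount point are illustrative placeholders, not values from the question:

        # Create a volume from the public snapshot in the same availability zone as the instance.
        aws ec2 create-volume --snapshot-id snap-d203feb5 --availability-zone us-east-1a
        # Attach it to a running instance (substitute the real volume and instance IDs).
        aws ec2 attach-volume --volume-id vol-xxxxxxxx --instance-id i-xxxxxxxx --device /dev/sdf
        # On the instance, the kernel may expose the device under a different name (e.g. /dev/xvdf).
        sudo mkdir -p /mnt/dataset
        sudo mount /dev/xvdf /mnt/dataset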


  • New Windows XP computer on Server 2003 network will not connect to file server

    - by Susan Otto
    When we try to connect to our file server with the new computer, it denies access. The computer is joined to the domain and I can see it in Active Directory. We need to connect to the file server for printing and terminal services. We have had this happen before and found that reinstalling Windows fixes the problem, but I would like a speedier solution. Any help would be appreciated.
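
    A couple of hedged diagnostics that can be run from the new XP machine before resorting to a reinstall; the server, share, domain, and user names below are placeholders, not values from the question:

        rem Try an explicit connection to see the exact error the server returns:
        net use \\fileserver\share /user:DOMAIN\username
        rem A large clock difference between client and domain can also cause access denials,
        rem so compare the times:
        net time \\fileserver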


  • rsync - Exclude files that are over a certain size?

    - by Rory
    I am doing a backup of my desktop to a remote machine. I'm basically doing rsync -a ~ example.com:backup/ However there are loads of large files, e.g. Wikipedia dumps etc. Most of the files I care a lot about are small, such as Firefox cookie files or .bashrc. Is there some invocation of rsync that will exclude files that are over a certain size? That way I could copy all files that are less than 10MB first, then do all files. That way I can do a fast backup of the most important files, then a longer backup of everything else.
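
    rsync's --max-size option does exactly this; a minimal sketch based on the command in the question (the 10M threshold simply mirrors the example above):

        # First pass: only files up to 10 MB, so the important small files get across quickly.
        rsync -a --max-size=10M ~ example.com:backup/
        # Second pass: everything, with no size limit.
        rsync -a ~ example.com:backup/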


  • How can I archive a 30 GB file?

    - by Joel Coehoorn
    I have a 30 GB zip file containing an archive of digital materials available in the school library that I want to burn to DVD. Of course, 30 GB is far too large for a single DVD, and the content is already zipped. I'm open to ideas, but leaning towards suggestions that will help me automatically spread the file over multiple DVDs, including a simple program to stitch it back together again later.
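
    One sketch of the split-and-reassemble approach using standard Unix tools (the 4300M chunk size is an assumption chosen to fit a single-layer DVD with some margin, and the archive name is a placeholder; on Windows, 7-Zip's split-volume feature is the usual equivalent):

        # Split the zip into DVD-sized pieces named archive.zip.part_aa, archive.zip.part_ab, ...
        split -b 4300M archive.zip archive.zip.part_
        # Burn the archive.zip.part_* files to separate DVDs, then later reassemble with:
        cat archive.zip.part_* > archive.zip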


  • Replace spaces in file names from the Unix command line

    - by Aly
    Hi, I have a bunch of files with spaces in their names. Is there a way to mv them to new names without the spaces? For example, I have the file Hello World.pdf and I want to move it to Hello_World.pdf. Obviously for one file I can use the mv command, but I want to do it for all files in a folder. Thanks
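
    A minimal bash sketch, run inside the folder in question; it only touches names that actually contain a space:

        for f in *\ *; do
            [ -e "$f" ] || continue          # skip if nothing matched the glob
            mv -- "$f" "${f// /_}"           # replace every space with an underscore
        done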


  • Windows 7: Event 55 "The file system structure on the disk is corrupt and unusable"

    - by Radio
    Here is a real bad one! Windows 7 RTM with SP1 installed [Version 6.1.7601]. Recently I tried to delete some folder on my hard drive and Windows prompted "Error 0x80070570: The file or directory is corrupted and unreadable", and at the same time logged an Event 55 describing "The file system structure on the disk is corrupt and unusable. Please run the chkdsk utility on \Device\HarddiskVolume2." I ran chkdsk, first with the /f option, then with the /r option. The result in both cases was: no errors found, 0 bad sectors. chkdsk found no problems at all! I went through StackExchange and Google and spent over 6 hours on this. Still cannot figure out the problem. Re-installing/re-formatting is not an option! What did I try: the hotfix Windows6.1-KB982927-x64.msu gave me an error about incompatibility with my computer, even though it totally matches my system (the CRC of the hotfix was OK). The Windows Repair Console found startup errors and fixed those, but this didn't help the issue, even after running chkdsk c: /R from it. http://support.microsoft.com/kb/246026 does not promise anything good. sfc /scannow does not help either. I replaced the hard drive by cloning the old one using True Image and repeated all the steps above. At the same time, some minor glitches have started to appear in my Windows, like side panel and notification area settings getting reset. The goal is to delete the folder and get rid of Event 55. It sounds like an NTFS bug. Please help. This is completely ridiculous.


  • Media file segmenter-like tool for Linux

    - by Raja
    Hello everyone, I'm looking for a tool for Linux which can segment a video file into multiple small .ts files. I know of one for Mac OS X called media file segmenter, which is a simple command-line tool. I would really appreciate it if anyone could point me to an equivalent Linux tool. I don't know if these types of non-programming questions are allowed on this site; my apologies if not.
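
    One possible equivalent, assuming ffmpeg is acceptable: its segment muxer writes numbered .ts chunks. The file names and the 10-second duration below are illustrative:

        # Copy the streams without re-encoding and cut them into roughly 10-second .ts segments.
        ffmpeg -i input.mp4 -c copy -map 0 -f segment -segment_time 10 out%03d.ts
        # If the copied codec is not valid in an MPEG-TS container, drop "-c copy" and let
        # ffmpeg re-encode with its defaults.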


  • Index a set of files to search their text quickly?

    - by Ricket
    I have a unique need: I am frequently searching a large set of text files for a keyword. Right now, I open up Notepad++ and use the "Find in files" feature. It works just fine, but with the number of files involved, each search takes several minutes to complete. Is there a good program more suited for this purpose, perhaps one that indexes a set of files and then lets you search the set repeatedly and very quickly? It would greatly speed up my workflow.


  • Unable to upload a file.

    - by user39438
    We are not able to upload a file into a SharePoint 2007 document library, and we suspect that it could be due to Forefront. Have you faced this issue? If so, please let me know how you fixed it. The funny part is that it allows us to upload a file sometimes (not always).


  • Computer Stuttering When Transferring Over Network

    - by Nalandial
    This is a really weird problem that I've never even seen before. When I copy to or from my server share, my computer stutters terribly and the data transfers very slowly, at only around 12MB/s. By stuttering I mean the mouse skips around and all my applications respond very slowly; as soon as I cancel the transfer it resolves immediately. I looked at Task Manager and the CPU is only at ~35% with plenty of RAM free. This only started semi-recently; before, I had no problems and the transfer speed maxed out the gigabit connection. I have two hard drives in my computer. When I try transferring files between drives it's fine, but when I copy from the share to either drive, or to the share from either drive, I get stuttering. I'm running Windows 7 x64. Anyone have any idea what's going on? Any help would be much appreciated.


  • Moving case-sensitive Linux files via Windows

    - by sunwukung
    Hi, the company I work for is currently trying to move a Magento installation from one server to another. However, the product images are saved in alphabetically indexed folders, with an added twist: some of the letters are the same but have a different case, i.e. a, A, b, C, D, e, E, f, F, G, h, I. That being the case, when we try to drag those files down from FTP in order to move them, Windows does not honour the case-sensitive distinction and we are losing several image folders. Is there a simple workaround for this issue? Any help is greatly appreciated. Regards, SWK
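
    One common workaround sketch: bundle the folders into a single archive on the Linux side, so the Windows hop never has to represent the individual case-sensitive names (the directory and archive names are placeholders):

        # On the source server:
        tar czf product-images.tar.gz media/
        # Transfer product-images.tar.gz (FTP, scp, etc.), then on the destination server:
        tar xzf product-images.tar.gz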


  • Change Drobo directory permissions

    - by Steven Scott
    I have a Drobo unit that is connected via FireWire to a Dell notebook running Ubuntu 12.04. I cannot seem to change the permissions to allow all users to have read/write access to the drives. The unit automatically mounts the volumes as my user, so other applications cannot access the device. I want to set up a Plex Media Server to stream music, etc., but it will not scan the drives since it cannot access them. How can I change the permissions to allow everyone to read the volumes? If I add them to the fstab as NTFS volumes, Ubuntu reports that they are not available during boot, likely because FireWire has not found the drives yet. Any ideas or suggestions would be greatly appreciated.
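
    A hedged sketch, assuming the Drobo volumes really are NTFS and show up as something like /dev/sdb1; the device name, mount point, and uid/gid values are assumptions to adjust:

        sudo mkdir -p /media/drobo
        # umask=0022 makes the files world-readable; use umask=0000 for world read/write.
        sudo mount -t ntfs-3g -o uid=1000,gid=1000,umask=0022 /dev/sdb1 /media/drobo
        # In /etc/fstab, "noauto" avoids the boot-time error when FireWire has not yet
        # enumerated the drive; mount it afterwards by hand or from a script:
        # /dev/sdb1  /media/drobo  ntfs-3g  noauto,uid=1000,gid=1000,umask=0022  0  0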


  • I disconnected my cellphone while transferring files to its Mini SD card. Now the files aren't there

    - by Martín Fixman
    I use Ubuntu 9.10, and the MiniSD card shows the space as used, as if the files were there. Baobab (the disk usage analyzer) shows that the card only has 118 MB used (of the 401 MB Ubuntu claims there are). Of course, I already tried the obvious (rebooting the phone, adding and removing files, etc.), but I don't want to format the card: I still have some files on it, the transfer to my computer is slow, and because I use an old wire it fails often.
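
    If the card uses a FAT filesystem (typical for phone memory cards), an interrupted transfer often leaves orphaned cluster chains that consume space without appearing as files; a hedged repair sketch from Ubuntu, where the device name is an assumption (check dmesg after inserting the card, and unmount it first):

        sudo umount /dev/sdb1
        # -a repairs automatically, -v is verbose; both are standard dosfstools options.
        sudo fsck.vfat -a -v /dev/sdb1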


  • "fopen: No such file or directory" error

    - by Akshay Bhat
    I am getting the following cryptic error:

        akshay@akshay-VirtualBox:/mnt/mmpp$ ./bin/metamap10 /mnt/mmpp/bin/SKRrun.10 -L 2010 /mnt/mmpp/bin/metamap10.BINARY.Linux -Z 10 --debug input.txt
        fopen: No such file or directory

    Does this error imply that fopen cannot find a required file, or that fopen itself is nonexistent? Note that both SKRrun.10 and metamap10.BINARY.Linux are present at the correct location. I am using this software, http://metamap.nlm.nih.gov/, on Ubuntu.
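
    The message almost certainly means some fopen() call inside the program failed, not that fopen is missing. A hedged way to see which path it is trying to open, assuming strace is installed (the abbreviated command line and trace file name are illustrative):

        # Prefix the failing command with strace and log every file-related syscall:
        strace -f -e trace=file -o /tmp/metamap.trace ./bin/metamap10 input.txt
        # The paths that could not be found show up with ENOENT:
        grep ENOENT /tmp/metamap.trace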


  • Dropbox takes hours to sync & shows different modified times (coincidentally in the future)

    - by user10580
    Dropbox is taking hours to sync, and I can't tell exactly how long because the timestamps on the website make no sense: they say the files were modified... tomorrow. Actually my netbook (Windows XP) says they were last modified tomorrow in Windows Explorer as well. It's bizarre. The time and date on both computers are correct. The files in question are in a symlinked directory on the laptop (which are synced fine, with the correct timestamps). I have looked for an option to force Dropbox to sync, but haven't located one. (There might be a command-line method, but I haven't had time to explore.) Thanks


  • Windows XP Does Not Follow CNAME Shares

    - by user49349
    I am supporting a mix of Windows XP Pro and Windows 7 desktops in my Active Directory network, and I am having an odd issue with XP and CNAME records. Say I have a record in my DNS for a server with an A record of something like STORAGE.company.local, and I give it a CNAME of NAS.company.local. I can go onto an XP or 7 computer, ping NAS, and it will automatically resolve to STORAGE.company.local. If I am on Windows 7 and go to Run and enter \\STORAGE or \\NAS, it will go to that server in Explorer. If I do the same in XP, \\STORAGE will work but \\NAS will not; it just times out. Is there some setting buried in XP to make this work properly?
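
    One commonly cited fix is server-side rather than on XP (this is an assumption, not something stated in the question): Windows file servers can reject SMB connections made to a name other than their real hostname, and the following registry value on the Server 2003 file server relaxes that check (restart the Server service or reboot afterwards):

        reg add "HKLM\SYSTEM\CurrentControlSet\Services\LanmanServer\Parameters" /v DisableStrictNameChecking /t REG_DWORD /d 1 /f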


  • Windows - Delayed Write Failed error on USB hard drive

    - by ndngrd
    I've got a new Verbatim 1.5TB USB hard drive (Samsung HD154UI) and I'm finding myself completely unable to fill it. I'm using Windows XP. Whenever I try to copy a load of files over, it works for some time (it will copy between 20 and 90GB) but eventually stops with an error saying "The specified path is too deep". The specified path is not too deep; nothing I'm copying is more than 2 directories deep. A balloon pops up at the bottom saying "Windows - Delayed Write Failed", telling me the data could not be copied. This wouldn't be too bad if I could just restart the transfer, but after this error has happened I can't write anything else to the disk, even if I eject it and then connect it to another machine. It just seems completely locked. The only way I can unlock it is to delete everything that I was copying to it. I've tried various USB cables and copying from different machines, and the same thing keeps happening.


  • How to copy a 200GB file faster?

    - by RainDoctor
    I have a 200GB .tgz file on server A (RHEL 5.2). I want to transfer that file to server B (RHEL 5.3). Server B is a VM on ESXi 4 Update 1; I gave it 10GB, with 4 vCPUs. Server A and server B are connected directly with an Ethernet cable and local IP addresses (no switch involved). scp gives me about 3Mbps. Is there a way to get 400Mbps?
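
    scp's encryption is often the bottleneck on older CPUs, so one hedged sketch is a raw netcat transfer over the direct cable. The port, paths, and IP address are illustrative, there is no encryption or authentication, and option syntax varies between netcat builds (nc -l 5000 versus nc -l -p 5000):

        # On server B (receiver), listen and write the incoming stream to disk:
        nc -l 5000 > /data/file.tgz
        # On server A (sender), push the file to server B's local IP:
        nc 192.168.0.2 5000 < /path/to/file.tgz
        # Afterwards, compare checksums on both ends:
        md5sum /path/to/file.tgz        # and md5sum /data/file.tgz on server B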


  • Check if all files in a directory exists elsewhere

    - by aioobe
    I'm about to remove an old backup directory, but before doing so I'd like to make sure that all of its files exist in a newer directory. Is there a tool for this? Or am I best off doing this "manually" using find, md5sum, sorting, comparing, etc.? Clarification: if I have the following directory listings

        /path/to/old_backup/dir1/fileA
        /path/to/old_backup/dir1/fileB
        /path/to/old_backup/dir2/fileC

    and

        /path/to/new_backup/dir1/fileA
        /path/to/new_backup/dir2/fileB
        /path/to/new_backup/dir2/fileD

    then fileA and fileB exist in new_backup (fileA in its original directory, and fileB has moved from dir1 to dir2). fileC, on the other hand, is missing in new_backup, and fileD has been created. In this situation I'd like the output to be something like: fileC exists in old_backup, but not in new_backup.
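
    A minimal manual sketch that compares files by checksum, so a file still counts as present even if it moved between subdirectories; the temporary file names are illustrative:

        ( cd /path/to/old_backup && find . -type f -exec md5sum {} + ) | sort > /tmp/old.md5
        ( cd /path/to/new_backup && find . -type f -exec md5sum {} + ) | sort > /tmp/new.md5
        # Checksums present in the old backup but absent from the new one
        # (the first 32 characters of each md5sum line are the hash):
        comm -23 <(cut -c1-32 /tmp/old.md5 | sort -u) <(cut -c1-32 /tmp/new.md5 | sort -u) > /tmp/missing.md5
        # Show which old paths those missing checksums belong to; empty output means
        # everything in old_backup also exists somewhere in new_backup:
        grep -F -f /tmp/missing.md5 /tmp/old.md5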


  • nginx probably delivering wrong filetype for .css file with PHP tags

    - by Katai
    And again, NGINX is giving me many questions today :) As always, I already tried around for a while but can't seem to fix this issue: I just configured NGINX to handle my .css files the same as my .php files (to parse PHP tags inside the CSS file). This works perfectly, and the file is found and delivered. I could debug it with Firebug, and everything is OK (it displays the contents of the .css inside the opened <link> tag). So, everything working, right? Wrong. The page gets the CSS, but it does not interpret it! What I mean by this: apparently, the file type (or application type, whatever) of the CSS is wrong. The page can access the CSS, but doesn't bother at all to actually use it.

    What I checked / tried: there are no PHP errors inside the .css, so that one is out. The .css is accessible; I can call the URI manually, or check that the included URL finds it: both work. The .css has no syntax errors (I switched to a CSS file that just has body { background-color: #000; }). It works without NGINX. I deleted the browser cache and restarted NGINX after config rewrites.

    Here is the configuration:

        server {
            listen 80;
            server_name localhost;
            access_log /var/log/nginx/board.access_log;
            error_log /var/log/nginx/board.error_log warn;
            root /var/www/board/public;
            index index.php;
            fastcgi_index index.php;
            location / {
                try_files $uri $uri /index.php;
            }
            location ~ (\.php|\.css)$ {
                try_files $uri =404;
                include /etc/nginx/fastcgi_params;
                #keepalive_timeout 0;
                fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
                fastcgi_pass 127.0.0.1:7777;
            }
        }

    Firebug 'Network' response headers:

        Connection: keep-alive
        Content-Encoding: gzip
        Content-Type: text/html
        Date: Sat, 16 Jun 2012 10:08:40 GMT
        Server: nginx/1.0.5
        Transfer-Encoding: chunked
        X-Powered-By: PHP/5.3.6-13ubuntu3.7

    I think I just answered my own question: is the Content-Type text/html the problem? How can I remove that? My personal guess is that I have to use this in some way:

        include /etc/nginx/mime.types;
        default_type application/octet-stream;

    But I'm not sure... anyone have an idea how to solve this? TL;DR: the CSS file is delivered correctly, but the browser doesn't seem to 'use' it as CSS. (Tested; it works on Apache.)
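
    The text/html header is likely the problem: PHP sends Content-Type: text/html by default for anything it serves, and browsers will typically refuse to apply a stylesheet delivered as HTML. A hedged way to confirm this from the command line, plus the usual fix of letting the PHP-parsed stylesheet declare its own type (the URL path is illustrative):

        # Check what Content-Type is actually being sent for the stylesheet:
        curl -I http://localhost/css/style.css
        # If it reports "Content-Type: text/html", have the PHP-parsed .css emit its own
        # header by starting the file with:
        #     <?php header('Content-Type: text/css'); ?>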

