Search Results

Search found 40479 results on 1620 pages for 'binary files'.


  • Common filesystem for servers behind a Rackspace load balancer

    - by thanos panousis
    Our PHP application consists of a single web server that receives files from clients and performs a CPU-intensive analysis on them. Right now, analysis of a single user upload can take about 3 seconds to complete and consumes 100% CPU, which caps our system capacity at roughly one third of a request per second. My team's requirement is to increase capacity without a lot of code reengineering. A possible solution would be to set up a load balancer in front of multiple servers running the same app, connecting to a common DB. The problem is that the analysis outputs files on disk. A load balancer would increase capacity, but then the files wouldn't be available across servers, so subsequent client requests may fail. We are hosted on Rackspace; is there a way to configure some sort of "common" storage for all the servers, without having to rewrite our file persistence code? The current code relies on simple fopens etc. What are our options?
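    For context, a minimal sketch of the kind of shared storage being asked about, assuming an NFS export from a separate storage node (the host name, network range, and paths here are hypothetical, not from the question); fopen would keep working unchanged because the share appears as an ordinary local path:

        # on the storage node: export the upload directory
        echo '/srv/uploads 10.0.0.0/24(rw,sync,no_subtree_check)' >> /etc/exports
        exportfs -ra

        # on each web server behind the balancer: mount it where the app writes
        mount -t nfs storage01:/srv/uploads /var/www/app/uploads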

    Read the article

  • MS Excel VBA/macro equivalent in LibreCalc or OpenOfficeCalc

    - by ReggieCL
    Is there an equivalent macro/VBA in LibreOffice Calc that does this routine: read/open the .xlsx files in a path and do a batch import/copy of the read sheets, merging them into the currently open workbook? Here's the VBA I used in MS Excel. Thanks in advance.

        Sub Consolidate_Sheets()
            'Folder path to read the .xlsx files from
            Path = "F:\WIP2\Below 25\"
            filename = Dir(Path & "*.xlsx")
            Do While filename <> ""
                Workbooks.Open filename:=Path & filename, ReadOnly:=True
                'Import/copy sheets from the opened .xlsx file
                For Each sheet In ActiveWorkbook.Sheets
                    sheet.Copy After:=ThisWorkbook.Sheets(1)
                Next sheet
                Workbooks(filename).Close
                filename = Dir()
            Loop
        End Sub

    Read the article

  • C drive should only contain OS. Myth or fact?

    - by Fasih Khatib
    So, I have a 500GB HDD @ 7200RPM, split as C: 97GB, D: 179GB, E: 188GB. My belief is to keep the OS ONLY in C:\ (plus any adamant programs that won't install anywhere other than C:\), because this speeds up the PC during the startup process, and to install programs in D:\ so that if I have to reinstall the OS, I will have the programs readily available after the reinstall. But I have begun to think this approach is flawed: if C:\ is formatted, I will lose the registry values and everything that goes in %appdata%, so there is no point keeping programs in D:\ because they will be useless after all. Should I go ahead and install ALL of my programs in C:\ and then use D:\ and E:\ for storing my data, like photos, text files, Java files and so on? How will this impact the performance of the HDD? I only have 3 programs in D:\Program Files, so it will be easy to reinstall them :)

    Read the article

  • Free duplicate music finder for Mac

    - by Jack M.
    I'm trying to clean up an MP3 folder which has a plethora of duplicate files in it, due to accidentally dragging my music folder into iTunes and having it re-import songs which were already in the playlist. I tried writing a quick Python app to MD5 all of the files and delete exact duplicates. This took out ~2GB of files. Unfortunately, it does not catch all of the duplicates, because of an iTunes feature: iTunes has changed the ID3 title on some of the duplicate songs, which means the MD5 of the entire file differs from the same song with a different ID3 tag. Are there any free applications out there (for the Mac) which can compare the data of the actual song (ignoring ID3 tags) and determine whether duplicates exist?
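    For reference, one way to hash only the audio while ignoring ID3 tags, assuming ffmpeg is installed (which the question does not mention; the file name is hypothetical):

        # MD5 of the decoded audio stream only; tag changes don't affect it
        ffmpeg -loglevel error -i song.mp3 -map 0:a -f md5 -

    Running this over each file and comparing the digests would flag the re-imported duplicates that the whole-file MD5 approach misses.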

    Read the article

  • Folders disappear every time Windows XP starts up

    - by Reebz
    Whenever my Windows XP machine starts up, subfolders disappear from the first top-level folder, listed alphabetically (e.g. from "C:\AA Backups"). The first time it happened I suspected user error (such as an unintentional delete or copy), but I then found it happens on every start-up, sometimes affecting huge numbers of files. Renaming the affected folder (e.g. to "ZZ Backups") just means that a different folder is affected the next time. Avast found no virus or malware that would seem to be responsible. The missing files are not visible to an undelete utility such as NTFSUndelete. Running "chkdsk /f" found no problems and did not fix the issue. File permissions also appear corrupted: a few files which should be accessible are missing "read" permission. What has happened to this machine? Any ideas or reports of similar experiences would be most welcome.

    Read the article

  • Linux - File was deleted and then reappeared when folder was zipped

    - by davee9
    Hello, I am using BackTrack 4 Final, which is a Linux distro that is Ubuntu based. I had a directory that contained around 5 files. I deleted one of the files, which sent it to the trash. I then zipped the directory up (now containing 4 files), using this command:

        zip -r directory.zip directory/

    When I then unzipped directory.zip, the file I deleted was in there again. I couldn't believe this, so I zipped up the directory again, and the file reappeared again, but this time it could not be opened because the operating system said it didn't exist or something. I don't remember the exact error, and I cannot make this happen again. Would anyone happen to know why a file that was deleted from a directory would reappear in that directory after it was zipped up? Thank you.
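    One detail that may be relevant here: zip updates an existing archive in place rather than replacing it, so a stale directory.zip left over from before the delete would keep the old file's entry. A minimal sketch of checking for and forcing a clean sync, if that matches what happened (archive name as in the question):

        # -FS (filesync) rebuilds the archive to match the directory,
        # dropping entries whose files no longer exist on disk
        zip -r -FS directory.zip directory/
        unzip -l directory.zip    # list the entries to verify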

    Read the article

  • IIS: redirect everything to another URL, except for one Directory

    - by DrStalker
    I have an IIS server (IIS 6, Windows Server 2003) that hosts the site http://www.foo.com. I want any request to http://foo.com (no matter what path/filename is used) to redirect to http://www.bar.org/AwesomePage.html, UNLESS the request is for http://www.foo.com/specialdir, in which case the HTML files in the local directory specialdir should be used. The problem I have is that once the redirect is set, it also affects /specialdir: even if I right-click on that directory and select "content should come from ... local directory", the change does not take effect, and the directory still shows as redirecting to http://www.bar.org/AwesomePage.html. The same thing happens if I try to set individual files to load from the local system instead of redirecting; IIS gives no error, but the change does not take effect and the files still show as being redirected. How can I set specialdir to override the redirection to the new URL?
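    For reference, a sketch of the metabase equivalent of those UI steps, in case the per-directory override behaves differently when set from the command line; this assumes the default AdminScripts location and that this is site 1 in the metabase (both are assumptions to check):

        cd C:\Inetpub\AdminScripts
        rem set the redirect at the site root
        cscript adsutil.vbs SET W3SVC/1/ROOT/HttpRedirect "http://www.bar.org/AwesomePage.html"
        rem remove the inherited redirect on the subdirectory
        cscript adsutil.vbs DELETE W3SVC/1/ROOT/specialdir/HttpRedirect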

    Read the article

  • How do I convert a .vhd disk image to work with VMware Fusion 2?

    - by Paul D. Waite
    I’ve just installed VMware Fusion 2 on my Mac. Microsoft makes available some Virtual PC disk images containing different versions of IE, so that us humble web developers can test our code on them: http://www.microsoft.com/downloads/details.aspx?FamilyId=21EABB90-958F-4B64-B5F1-73D0A413C8EF&displaylang=en I want to convert these .vhd files to work with VMware Fusion 2. Note: VMware Fusion 3 can import .vhd files natively (File > Import), and this works just fine on the Microsoft IE compatibility VMs. I’ve tried VMware Converter Standalone on Windows, but it doesn’t work with .vhd files (as of the current version, 4.0.1). Any ideas? VMware’s website is confused corporate hell.
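    As a point of reference, one route that sidesteps VMware Converter entirely is converting the disk image itself, assuming qemu-img is available (the file names here are hypothetical):

        # 'vpc' is qemu-img's name for the Virtual PC / .vhd format
        qemu-img convert -f vpc -O vmdk WindowsXP.vhd WindowsXP.vmdk

    A new Fusion VM could then be created around the resulting .vmdk.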

    Read the article

  • Convert SWF/FLA back to PDF [closed]

    - by mitjak
    Possible Duplicate: How do I convert SWF into a PDF? I have a number of graphics- and text-only SWF and FLA files that I'd like to convert to PDF, preserving the text and formatting. Printing to PDF from Flash Player seems to flatten everything into an image without preserving the text. I've got a large number of these files, so it would be quite nice to have an automatable/console utility to do the job. I'd appreciate any help or pointers. Thanks! EDIT: It seems like most of the SWF files were created with pdf2swf, so I'm hoping there is a similar tool for doing the reverse conversion.

    Read the article

  • Why can't I index a SUBST'd drive in Windows 7?

    - by Andy
    I've got a SUBST mapping a folder to drive letter P:. I have noticed that exploring these folders from P: is now INCREDIBLY slow, sometimes taking up to a minute to show files. I'm showing them as general files and not thumbnails, so it's not that. Looking at the original folder in Explorer is lightning fast. I've checked the indexing options, and the folder where my files are stored is indeed checked as indexed. I can see my P: drive in the list, but clicking on the checkbox won't do anything; it's not even checkable. Does anyone have any clues as to how I can fix this? (Running Windows 7, just to be clear.)

    Read the article

  • Moving Windows 7 ProgramData folder after installation

    - by thinkzig
    I need to move the C:\ProgramData folder of a Windows 7 installation to D:\ProgramData. I understand how to make the symlinks and registry changes so this works. My problem is that I'm unable to copy the files in the ProgramData folder because the OS seems to have some of them locked. Specifically, the files in the C:\ProgramData\Microsoft\Crypto\RSA\MachineKeys folder are blocking the move. Am I out of luck here? Is there any way to move the folder, create the symlink, and update the registry without any of the files in these folders being locked?
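    For reference, a sketch of the link step being described, assuming the copy has already succeeded and the original folder has been moved aside (the locked files are the open question); run from an elevated command prompt:

        rem create a junction so C:\ProgramData resolves to the new location
        mklink /J C:\ProgramData D:\ProgramData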

    Read the article

  • Importing orphaned Outlook 2010 OST file

    - by BigBadJock
    I have a problem with Outlook 2010 and OST files. First, my Exchange hosting company deleted my Exchange account by accident. They've created it on another server, but can't get the data back. Now, I did make a copy of the \users\name\appdata\local\outlook directory, so I have the original OST files. I decided to switch hosts to Office 365. During this, I stupidly deleted my account from within Outlook and recreated it to point to Office 365, and only then did I learn that you can't import from OST files. Edited to clarify: I have a complete backup of the PC. Which folders would I need to restore to ensure that I can get Exchange back to its previous state? I'm prepared to do a complete restore if necessary, but would prefer to localise the changes.

    Read the article

  • LAME: Switch sample rate of file without reencoding?

    - by TK Kocheran
    Is it possible to resample an MP3 file to a different rate (44.1kHz) without doing a re-encode? I have a few MP3 files that are at 48kHz and I need to switch 'em to 44.1kHz, and I don't want to have to re-encode the files to do so, as I'll lose quality. The source files are CBR 320 at 48kHz. Can this be done? The current way I'm doing it is with the following command:

        lame -b 320 -q 0 --resample 44.1 input.mp3 output.mp3

    Is there a better way to do this?

    Read the article

  • Homedir inside homedir restricted access

    - by blid
    On my VPS I've installed Debian and Apache+PHP. I have 2 users: foo and bar. Apache is configured to execute PHP files from /home/foo/htdocs. I created the directory /home/foo/htdocs/bar/ and made it the home dir for user bar. However, I need to make a restriction: bar can't read, write or execute any files outside his own dir, but Apache has to be able to execute all PHP files under /home/foo/htdocs. I tried to chown the bar dir to user bar only, and also experimented a lot with chmod, but without a result so far. If there's any better way to satisfy my needs, don't hesitate to write about it. Thanks in advance.
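    A minimal sketch of the kind of ACL layout being asked about, assuming Apache runs as www-data (that account name is an assumption; adjust to the real one):

        # bar owns his home directory outright
        chown bar:bar /home/foo/htdocs/bar
        chmod 700 /home/foo/htdocs/bar
        # close the parent tree to "other" so bar cannot touch foo's files
        chmod 750 /home/foo /home/foo/htdocs
        # let bar pass through the parents (x only: traverse, no listing)
        setfacl -m u:bar:x /home/foo /home/foo/htdocs
        # give Apache (www-data assumed) traverse on /home/foo
        # and read+traverse on the whole PHP tree, including bar's dir
        setfacl -m u:www-data:x /home/foo
        setfacl -R -m u:www-data:rX /home/foo/htdocs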

    Read the article

  • SSH and Latent Connections (e.g., satellite connections)

    - by user71494
    Most of the week I live in the city, where I have a typical broadband connection, but most weekends I'm out of town and only have access to a satellite connection. Trying to work over SSH on a satellite connection, while possible, is hardly desirable due to the high latency (> 1 second). My question is this: is there any software that will do something like buffering keystrokes on my local machine before they're sent over SSH, to help make the lag on individual keystrokes a little more transparent? Essentially I'm looking for something that would reduce the effects of the high latency for everything except commands themselves (e.g., opening files, changing to a new directory, etc.). I've already discovered that vim can open remote files locally and write them back remotely, which, while a huge help, is not quite what I'm looking for, since it only works when editing files and requires opening a connection every time a read/write occurs. (For anyone who may not know how to do this and is curious, the command is: vim scp://host/file/path/here)

    Read the article

  • Command output as string

    - by rik
    I want to get the output of the command "C:\Program Files (x86)\Java\jre7\bin\java.exe" -version as a string variable. I tried this way:

        $out = & "C:\Program Files (x86)\Java\jre7\bin\java.exe" -version

    but it gives this error message:

        java.exe : java version "1.7.0_05"
        At line:1 char:9
        + $out = & <<<< "C:\Program Files (x86)\Java\jre7\bin\java.exe" -version
            + CategoryInfo          : NotSpecified: (java version "1.7.0_05":String) [], RemoteException
            + FullyQualifiedErrorId : NativeCommandError

        Java(TM) SE Runtime Environment (build 1.7.0_05-b05)
        Java HotSpot(TM) Client VM (build 23.1-b03, mixed mode, sharing)

    The $out variable seems empty. What am I doing wrong?
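    For reference, java -version writes to stderr rather than stdout, which is why $out ends up empty; a minimal sketch of capturing it in PowerShell (path as in the question):

        # 2>&1 merges stderr into the pipeline; Out-String flattens it to one string
        $out = & "C:\Program Files (x86)\Java\jre7\bin\java.exe" -version 2>&1 | Out-String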

    Read the article

  • SAMBA and Linux ACLs -- "Permission denied" on write to share but file written nevertheless

    - by MCH
    I set up a writable share directory /home/net/share with an ACL like this:

        sudo mkdir -p "/home/net/share"
        sudo setfacl -m "u:localuser:rwx,u:remoteuser:rwx,g:users:rwx" "/home/net/share"

    My /etc/samba/smb.conf looks like this:

        [global]
        workgroup = w
        server string = server
        security = user
        load printers = no
        log file = /var/log/samba/%m.log
        max log size = 50
        dns proxy = no
        printing = bsd
        printcap name = /dev/null
        disable spoolss = yes
        encrypt passwords = true
        invalid users = nobody root
        follow symlinks = yes
        wide links = yes

        [share]
        comment = Writable by localuser and remoteuser
        path = /home/net/share
        valid users = remoteuser
        read only = no
        public = no
        printable = no

    Locally, localuser and remoteuser have user accounts and smbpasswds, and both can read, create and delete files in /home/net/share. But when I log on from a different machine (like this: sudo mount -t cifs //server/share mountpoint/ -o username=remoteuser), I get "Permission denied" both when trying to create directories and files; oddly, though, it does create files (not directories!) despite these messages. How can I get this working?

    Read the article

  • AFP/SMB transfers cap at 2 megabytes/sec over wireless N

    - by RD.
    I wanted to transfer files between two Mac computers. The network is wireless N, and both computers have wireless N modules in them. The problem is that when I transfer files between them via file sharing (AFP), the network speed caps at 2 megabytes/sec. Just downloading files from the internet I can get faster speeds, so this isn't a constriction of my Wi-Fi bandwidth; it appears to be a constriction of the protocol being used. My Wi-Fi is set to 130Mbit/s, so I should see real-world transfer speeds of around 12-16 megabytes/sec. I ran this command on both computers:

        sudo sysctl -w net.inet.tcp.delayed_ack=0

    which is supposed to lower TCP overhead, but it did not affect anything. How can I get the speed I am expecting?

    Read the article

  • Entourage to Outlook Migration questions

    - by George Bluff
    I am currently migrating a user's information from a POP email account to my Exchange server. I have already migrated them over to my hosted Exchange, and their email is flowing properly. Now, the user is moving from Entourage on a Mac (10.7) to Outlook 2010 on a PC (Windows 7). I was wondering what the easiest way to migrate him would be, since there are no .pst files. I have been able to get his email over by dragging the inbox from Entourage to the desktop, then converting the files to .eml using IMAPSize, importing them into Outlook Express (which will only work on Windows XP), then exporting to a .pst, then importing into the new account. It takes a while with large mailboxes, but it works. The issue I am now having is with calendar items. I exported the calendar and got a folder with all the .ics files, but Outlook 2010 doesn't seem to have an easy way to import all of them. Any thoughts?

    Read the article

  • SSD Performance for PHP?

    - by Andrew Fashion
    My programmer just built an application in PHP using the Doctrine ORM (it will be a high-traffic social networking website), and it's very heavy on PHP/Apache and CPU. The queries are wonderfully fast and MySQL is barely using any CPU; it's just Apache. I was curious whether an SSD would help speed up PHP/Apache, because I know the bottleneck is PHP reading multiple files, class files, and loading up a bunch of data. So common sense makes me think that if PHP is reading multiple PHP files, an SSD would help as far as read/write speed goes? I was thinking of using a high-performance SSD for the PHP application, but for user image uploads I would continue using a 15k SAS drive. Are there any performance issues with using an SSD in this kind of situation? And would it prove to help speed up PHP/Apache and help with the CPU problem?

    Read the article

  • Opening a Corrupted winmail.dat file

    - by tearman
    I have a set of winmail.dat files that apparently have evolved from a set of emails with corrupted headers. It looks like Exchange 2010 changed the headers around sometime last September and basically rendered the exported .eml files unable to open. The HTML/plain-text emails seem to open OK, but the files that use Rich Text (specifically Microsoft's TNEF format) will not open in any program, Microsoft or not. I've attempted to use many different non-Microsoft converters, and they see the messages as corrupted as well. If I remove the headers of an email and rename it as a winmail.dat file, some emails will open in Word, but most won't. If you take a look at an email in a text editor, there are null characters EVERYWHERE that distort the email itself. Does anybody have any experience with this and/or suggestions on how to at least open them?

    Read the article

  • Is there a program that will show a tree of the differences in two file trees?

    - by Huckle
    In Windows I manually back up from time to time by formatting my external drive and copying the contents of my data partition over. Inevitably there is a difference in the number and size of the files copied, because of system files, etc. Is there a program that will diff two directories recursively and compile the differences into a nice GUI tree that I could peruse (and preferably filter), to ensure that everything I want made it over to the drive? It should only show files that are not in both directories. (Also, please ignore the inadequacy of my backup solution.)

    Read the article

  • Find out the size of a .tar.gz archive in the terminal without unpacking

    - by Sven
    I have a 32GB .tar.gz archive and I'd like to know the size of the files if I unpack this compressed archive. I'd like to avoid unpacking the archive first and then using e.g. du. Is it possible to find out the size of the contained files without unpacking the compressed archive (on a Linux and/or Mac OS X system)? For another archive, I know that it also contains .tar.gz files. Is it also possible to calculate the size of the unpacked archives that are contained within an archive (for example, by setting a level to which the "unpacking" should be simulated)?
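    For reference, a common way to total the unpacked sizes without writing anything to disk (archive name hypothetical; this assumes GNU tar's verbose listing, where the size is the third column — on Mac OS X's bsdtar the column position differs):

        tar -tzvf archive.tar.gz | awk '{ total += $3 } END { print total " bytes" }'

    Note this only streams through the archive once; nested .tar.gz members would still be counted at their compressed size, so it does not answer the second, recursive part of the question.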

    Read the article

  • Crap, hard disk failure. Can I recover my "move"d folders?

    - by Doug
    I am in the process of moving all my files from an old laptop to a new one. I just moved 11GB of data from my old laptop to an external hard drive, and upon moving it from there to the new laptop, the external drive is producing a CRC error (Data Error (Cyclic Redundancy Check)). Now I am looking for a solution to recover the files that I moved off my old laptop (not the external). I understand they are just marked for potential overwriting to free up space. I was getting ready to test out GetDataBack, but it says to install it on a healthy Windows system and attach the drive that needs recovery as an external. However, I don't want to turn off my computer without first getting the okay, since it is in a "moved" state. Please help! What can I do to recover the moved files? I haven't touched the computer since the move. What can I use to recover them?

    Read the article

  • How to use chain.p7b with Apache?

    - by Debianuser
    I wanted to set up an SSL website on Apache and applied for a certificate from my local ISP. All they sent me was a single file named chain.p7b. I have always used certificates from other vendors without any issues, but they usually provide two files, to be configured as SSLCertificateFile and SSLCertificateChainFile in Apache. Following instructions from several online resources, I opened the p7b file in Windows and extracted 4 certificates from it. I then tried configuring Apache with one of the files and it worked, but it shows a warning: "The certificate is not trusted because no issuer chain was provided." I thought I had to use the remaining 3 files as SSLCertificateChainFile and/or SSLCACertificateFile. I tried that, but it didn't work, so I am assuming it might be something completely different. Has anyone faced this issue before? The following page http://www-01.ibm.com/support/docview.wss?uid=swg21458997 talks about using a keystore, but is that relevant to Apache?
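    For reference, a p7b bundle can be unpacked into the PEM certificates Apache expects with openssl (file names as in the question; drop -inform DER if the file turns out to be Base64/PEM encoded, which is worth checking first):

        # print all certificates in the PKCS#7 bundle as one PEM file
        openssl pkcs7 -inform DER -print_certs -in chain.p7b -out chain.pem

    The server certificate would then go in SSLCertificateFile and the remaining issuer certificates in SSLCertificateChainFile.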

    Read the article
