Search Results

Search found 37883 results on 1516 pages for 'sparse files'.

Page 450/1516

  • Merging 3 apps - 2.msi and 1.exe

    - by Netguy
    EXACT duplicate of "Combining three MSI into a single installer". Hi guys, I have 3 files: 1. Alky for Applications.msi (which makes Vista apps work on XP); 2. Windows Vista Sidebar.exe (which makes the Vista sidebar work on XP); 3. Gadget Extractor.msi (a part of number 2). The problem is that all 3 applications are installers and I want to merge them into one installer. So please tell me what I should do; I also want to remove some content (ordinary files) from number 2. Note: I do NOT want to just bundle the files so that the 3 installers start at the same time - I want to make them into one. The person who is able to help me gets a VPS with cPanel with RL/TF allowed :D
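
    One common way to end up with a single installer is a bootstrapper that chains the three packages inside one setup.exe. Purely as an illustration (not a verified solution), a WiX Burn bundle along these lines could do it - the GUID, version and the /S silent switch for the .exe are placeholders that would need checking, and it must be built with the WixBalExtension:

        <?xml version="1.0" encoding="UTF-8"?>
        <Wix xmlns="http://schemas.microsoft.com/wix/2006/wi">
          <!-- One setup.exe that installs the three packages in sequence -->
          <Bundle Name="Combined Setup" Version="1.0.0.0" Manufacturer="Example"
                  UpgradeCode="PUT-GUID-HERE">
            <BootstrapperApplicationRef Id="WixStandardBootstrapperApplication.RtfLicense" />
            <Chain>
              <MsiPackage SourceFile="Alky for Applications.msi" />
              <!-- /S is a guess at the exe's silent switch; check the real installer -->
              <ExePackage SourceFile="Windows Vista Sidebar.exe" InstallCommand="/S" />
              <MsiPackage SourceFile="Gadget Extractor.msi" />
            </Chain>
          </Bundle>
        </Wix>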

    Read the article

  • Disabling FileSystemWatcher for specific updates?

    - by chaiguy
    Does anyone have any ideas how I can reliably disable a FileSystemWatcher object when my application makes changes to the files in the directory, so that I am only watching for external changes to the directory? I've tried setting EnableRaisingEvents to false immediately before performing a write and setting it back to true immediately after, but it seems this method is not reliable, and occasionally I still get the event firing. The only other thing I can think of is to wait a small amount of time after performing the write to let the OS finish up the modification of the directory before re-enabling the FSW, but that seems hackish and I don't like it. To add to the problem, the directory consists of potentially many files, the identities of which are beyond my knowledge and control, so I can't just wait for the event to fire for a specific file and then ignore it. There could be any number of FSW events firing after a single modification (because of the potentially many files getting updated).
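
    One pattern that avoids toggling EnableRaisingEvents altogether is to leave the watcher running and ignore events for paths your own code is known to be writing. The C# below is only a sketch of that idea (class and member names are invented, and the bookkeeping is deliberately simplistic):

        using System;
        using System.Collections.Concurrent;
        using System.IO;

        class SelfAwareWatcher
        {
            private readonly FileSystemWatcher watcher;
            // Paths this process is about to modify, so their events can be ignored.
            private readonly ConcurrentDictionary<string, byte> expectedChanges =
                new ConcurrentDictionary<string, byte>(StringComparer.OrdinalIgnoreCase);

            public SelfAwareWatcher(string directory)
            {
                watcher = new FileSystemWatcher(directory);
                watcher.Changed += OnChanged;
                watcher.Created += OnChanged;
                watcher.EnableRaisingEvents = true;
            }

            // Call this instead of File.WriteAllText etc. for your own writes.
            public void WriteFile(string path, string contents)
            {
                expectedChanges.TryAdd(path, 0);
                File.WriteAllText(path, contents);
            }

            private void OnChanged(object sender, FileSystemEventArgs e)
            {
                // If we caused this change, swallow the event. Note: a single write
                // can raise several events, so a real implementation might time-stamp
                // markers and ignore events for a short window instead of removing
                // the marker on the first hit.
                if (expectedChanges.TryRemove(e.FullPath, out _))
                    return;

                Console.WriteLine("External change: " + e.FullPath);
            }
        }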

    Read the article

  • Linux server became extremely slow

    - by Ariel Aharonson
    I have a file sharing website, and my files are hosted on a server with these specifications: 32 GB RAM, 12x 3 TB disks, 2x Intel quad-core E5620. Files on this server are up to 4 GB each, and about 446 GB of the 36 TB volume is in use:

        [root@hosted-by ~]# df -h
        Filesystem                       Size  Used Avail Use% Mounted on
        /dev/sda2                         50G  2.7G   44G   6% /
        tmpfs                             16G     0   16G   0% /dev/shm
        /dev/sda1                         97M   57M   36M  62% /boot
        /dev/mapper/VolGroup01-LogVol00   33T  494G   33T   2% /home

    And take a look at this: why is the wa% so high? (I think that is what makes the server so slow.)
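
    A high wa% means the CPUs are idling while they wait on disk I/O. As a first diagnostic step - a sketch, assuming the sysstat package (and optionally iotop) is installed - you could watch per-device and per-process I/O to see what is hammering the disks:

        # Per-device statistics every 5 seconds; look for devices with high %util / await
        iostat -x 5

        # Per-process disk reads/writes every 5 seconds (also from sysstat)
        pidstat -d 5

        # Interactive view of the processes generating the most I/O (separate package)
        iotop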

    Read the article

  • How to open different App version for one given file extension

    - by Erik Lenaerts
    We have data files with the extension ".ppx" for our business app here. Users will typically have multiple versions of the application installed side by side, for example version 1 and version 2. The .ppx files are XML files and they contain the version of the app they were created with (v1 or v2). Let's say we have AFileCreatedWithAppv1.ppx and AFileCreatedWithAppv2.ppx: how can each one be opened with version 1 or version 2 of our app respectively, when they both have the same file extension? It must be doable, since that is what Visual Studio does. In fact, it even provides different icons for the same .sln extension to indicate which Visual Studio version will open the file. I learned that Visual Studio uses the Selector or Launcher in between, but then again, how do they change the icons in Windows? cheers :)
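
    The Visual Studio-style answer is to register one small launcher as the .ppx handler, have it read the version tag out of the file, and start the matching application version. As a rough sketch only (the file layout, the "version" attribute and the install paths are all invented here), such a launcher could look like this:

        import subprocess
        import sys
        import xml.etree.ElementTree as ET

        # Hypothetical mapping from the version stored in the .ppx file to the
        # executable of the matching application version.
        APP_PATHS = {
            "v1": r"C:\Program Files\OurApp\1.0\OurApp.exe",
            "v2": r"C:\Program Files\OurApp\2.0\OurApp.exe",
        }

        def launch(ppx_path):
            # Assumes the .ppx file is XML with a "version" attribute on the root
            # element -- adjust to wherever the real files store their version.
            root = ET.parse(ppx_path).getroot()
            version = root.get("version", "v1")
            exe = APP_PATHS.get(version, APP_PATHS["v1"])
            subprocess.Popen([exe, ppx_path])

        if __name__ == "__main__":
            launch(sys.argv[1])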

    Read the article

  • How can I send email attachment without using an additional library in Perl?

    - by CheeseConQueso
    Hey, I was wondering if there is a way to attach files (specifically .csv files) to a mail message in Perl without using MIME::Lite or any other libraries. Right now I have a 'mailer function' that works fine, but I'm not sure how to adapt it to attach files. Here is what I have:

        open(MAIL, "|/usr/sbin/sendmail -t");
        print MAIL "To: cheese\@yahoo.com\n";
        print MAIL "From: queso\@what.com\n";
        print MAIL "Subject: Attached is $filename\n\n";
        print MAIL "$message";
        close(MAIL);

    I think this is specific to UNIX.
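
    For what it's worth, a multipart message with a base64-encoded attachment can be written by hand through the same sendmail pipe. The sketch below uses only MIME::Base64 (a core module) and is untested boilerplate rather than a drop-in replacement for the function above:

        #!/usr/bin/perl
        use strict;
        use warnings;
        use MIME::Base64 qw(encode_base64);

        my $filename = 'report.csv';
        my $message  = "The CSV file is attached.\n";
        my $boundary = "----=_Part_" . time() . "_$$";

        # Read and base64-encode the attachment.
        open(my $csv, '<', $filename) or die "Cannot read $filename: $!";
        my $encoded = encode_base64(do { local $/; <$csv> });
        close($csv);

        open(my $mail, '|-', '/usr/sbin/sendmail -t') or die "Cannot run sendmail: $!";
        print $mail "To: cheese\@yahoo.com\n";
        print $mail "From: queso\@what.com\n";
        print $mail "Subject: Attached is $filename\n";
        print $mail "MIME-Version: 1.0\n";
        print $mail "Content-Type: multipart/mixed; boundary=\"$boundary\"\n\n";

        # Plain-text body part.
        print $mail "--$boundary\n";
        print $mail "Content-Type: text/plain; charset=us-ascii\n\n";
        print $mail "$message\n";

        # Attachment part, base64-encoded.
        print $mail "--$boundary\n";
        print $mail "Content-Type: text/csv; name=\"$filename\"\n";
        print $mail "Content-Transfer-Encoding: base64\n";
        print $mail "Content-Disposition: attachment; filename=\"$filename\"\n\n";
        print $mail $encoded;
        print $mail "--$boundary--\n";
        close($mail);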

    Read the article

  • Windows Server 2008 Software Raid 5 - Data integrity issues

    - by Fopedush
    I've got a server running Windows Server 2008 R2, with a (Windows-native) software RAID-5 array. The array consists of 7x 1TB Western Digital RE3 and RE4 drives. I have offline backups of this array. The problem is this: I noticed a few days ago, after copying a large file to the disk, that there was an integrity issue with that file - it was a ~12GB file that I had downloaded via uTorrent. After moving it to the RAID array, I used uTorrent to relocate the download location and performed a re-check so I could seed it from that location. The recheck found that only 6308/6310 chunks of the copied file were intact.

    My next step was to write a quick PowerShell script that would copy files to the array while computing a SHA1 hash of the original and resultant files and comparing them. Smaller files (100-1000MB) copied over just fine. When I started copying larger data (~15GB), I found that the hash check failed about 2/3rds of the time. The corrupt files had very, very small inconsistencies - less than .01%. I further eliminated the possibility of networking or client issues by placing this large file on the C:\ of the server and copying it repeatedly from there to the array, seeing similar results. Copying the data via Explorer, PowerShell, or the standard Windows command prompt yields the same results. None of the copies fail or report any problems. The RAID array itself is listed as healthy in Disk Management.

    After a few experiments, I shut down the server and ran memtest overnight. No errors were detected. A basic run of chkdsk found no problems, but I did not use the /R flag, as I was unsure how that might affect a software RAID-5 volume. I next ran CrystalDiskInfo to check the SMART data on the drives - but found that CDI only detected 5 out of 7 of the disks in the array. I have no idea why. Nevertheless, CDI shows the following "caution" flags on a single one of the drives:

        05 199 199 140 000000000001 Reallocated Sectors Count
        C5 200 200 __0 000000000001 Current Pending Sector Count

    Which is a little bit alarming, but I don't really know what to do with the information. I hardly feel like one reallocated sector could be causing this. At this point, I'm looking for some guidance on what to do next. I need to determine the cause of this issue, but I'm hesitant to run chkdsk /R or any bootable disk health checkers because I'm afraid they might break the array. I've considered triggering a re-sync of the array, but I'm not actually sure how to do that without doing something silly like manually dropping a disk and then restoring it. Any advice that could help me ferret out the precise cause of this issue would be greatly appreciated.
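
    The PowerShell script mentioned above isn't included in the post; as a rough equivalent for anyone who wants to reproduce the copy-and-compare test, a Python sketch (the paths are placeholders) might look like this:

        import hashlib
        import shutil

        def sha1_of(path, chunk_size=1024 * 1024):
            """Hash the file in chunks so large files don't have to fit in memory."""
            h = hashlib.sha1()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(chunk_size), b""):
                    h.update(chunk)
            return h.hexdigest()

        def copy_and_verify(src, dst):
            shutil.copyfile(src, dst)
            ok = sha1_of(src) == sha1_of(dst)
            print(("OK   " if ok else "FAIL ") + dst)
            return ok

        if __name__ == "__main__":
            # Placeholder paths: repeat the copy to see how often verification fails.
            for i in range(10):
                copy_and_verify(r"C:\testdata\large.bin", r"E:\raidtest\large_%d.bin" % i)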

    Read the article

  • How can I verify my SQL / PL/SQL syntax

    - by rima
    Hi all. Sorry, my English is bad; I hope you can understand what I want. I have lots of *.sql files, and I want to write a program that compiles them and reports any issues (problems or mistakes) to me. A friend of mine wrote an IDE for Java; as I remember, he used javac to generate the error messages. In the same way, when you write code in Visual Studio or NetBeans, the IDE generates the errors for you. So now I want to know: does anyone have any idea how I can do this for my SQL files? In other words, I want to write an editor for SQL (PL/SQL) files that compiles my code and tells me what my errors are. This problem came up when I tried to compile all of them in SQL*Plus, which is very tedious. Please help me...
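
    One low-tech way to get compile errors out of Oracle without stepping through SQL*Plus by hand is to compile the objects and then read USER_ERRORS. A sketch only - the object names here are examples:

        -- Run each script, then recompile the objects it created and list any errors.
        ALTER PROCEDURE my_proc COMPILE;
        ALTER PACKAGE my_pkg COMPILE BODY;

        -- USER_ERRORS holds line/column and the error text for invalid PL/SQL objects.
        SELECT name, type, line, position, text
          FROM user_errors
         ORDER BY name, sequence;

        -- In SQL*Plus, SHOW ERRORS prints the same information for the last compiled object.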

    Read the article

  • Secure, efficient, version-preserving, filename-hiding backup implemented in this way?

    - by barrycarter
    I tried writing a "perfect" backup program (below), but ran into problems (also below). Is there an efficient/working version of this?

    Assumptions: you're backing up from 'local', which you own and which has limited disk space, to 'remote', which has infinite disk space and belongs to someone else, so you need encryption. Network bandwidth is finite.

    'local' keeps a db of backed-up files with this data for each file:
    - filename, including full path
    - file's last modified time (mtime)
    - sha1sum of file's unencrypted contents
    - sha1sum of file's encrypted contents

    Given a list of files to back up (some perhaps already backed up), the program:
    - Runs 'find' and gets the full path/mtime for each file (this is fairly efficient; conversely, computing the sha1sum of each file would NOT be efficient).
    - Discards files whose filename and mtime are in the 'local' db.
    - Computes the sha1sum of the unencrypted contents of each remaining file. If the sha1sum matches one in the 'local' db, we create a special entry in the 'local' db that points this file/mtime to the file/mtime of the existing entry. Effectively, we're saying "we have a backup of this file's contents, but under another filename, so no need to back it up again".
    - For each remaining file, encrypts the file, takes the sha1sum of the encrypted file's contents, and rsyncs the file to a path derived from its sha1sum. Example: if the file's encrypted sha1sum was da39a3ee5e6b4b0d3255bfef95601890afd80709, we'd rsync it to /some/path/da/39/a3/da39a3ee5e6b4b0d3255bfef95601890afd80709 on 'remote' (a sketch of this path scheme follows below).
    - Once the step above succeeds, adds the file to the 'local' db.

    Note that we efficiently avoid computing sha1sums and encrypting unless absolutely necessary. Note: I don't specify the encryption method; this would be the user's choice.

    The problems:
    - We must encrypt and back up the 'local' db regularly. However, the 'local' db grows quickly and rsync'ing encrypted files is inefficient, since a small change in the 'local' db means a big change in the encrypted version of the 'local' db.
    - We create a file on 'remote' for each file on 'local', which is ugly and excessive.
    - We query the 'local' db frequently. Even with indexes, these queries are slow, since we're often making one query for each file. It would be nice to speed this up by batching queries or something.
    - Probably other problems that I've now forgotten.
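
    For illustration only, the sha1-derived remote path described in the steps above could be computed like this (the base directory is a placeholder):

        import hashlib

        def remote_path_for(encrypted_file, base="/some/path"):
            """Map an encrypted file to <base>/aa/bb/cc/<full sha1>, as described above."""
            h = hashlib.sha1()
            with open(encrypted_file, "rb") as f:
                for chunk in iter(lambda: f.read(1 << 20), b""):
                    h.update(chunk)
            digest = h.hexdigest()
            return "/".join([base, digest[0:2], digest[2:4], digest[4:6], digest])

        # e.g. remote_path_for("file.gpg") ->
        #   "/some/path/da/39/a3/da39a3ee5e6b4b0d3255bfef95601890afd80709"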

    Read the article

  • Moving multiple folders all at once in Outlook

    - by Luke
    Constantly at my shop, we are moving Outlook (or other email program) files between computers or Windows Installations, and sometimes, people have HUNDREDS of folders. Is there a quick way to move ALL the folders from multiple data files (*.PST) into one single file, without dragging each and every folder? No, I don't want to move the Inbox folder into the other Inbox folder for the quick move, I want something simple like selecting all folders and moving that way. Does such a method exist in any version of Outlook?

    Read the article

  • Can Linux play HDMI 1.4a 3D stereoscopic content?

    - by SofaKng
    I'm aware that there are no Blu-ray players for Linux, but I'm wondering if it's possible to play full HD 3D (1080p, side-by-side) MKV files (or Blu-ray BDMV folders, etc.). Full HD 3D files are actually two 1080p frames side by side, so the effective resolution is 3840x1080. In order to play these properly, the software needs to switch the TV into 3D mode (or however HDMI 1.4a works). I don't think simply playing the 3840x1080 file as a normal video will work, so are there any options out there?

    Read the article

  • Zend Server Cannot restart PHP: permission denied for user

    - by user30115
    When I click "Restart PHP" in the Zend Server web interface, I get this error in the logs: PHP Warning: Cannot restart PHP: permission denied for user IIS APPPOOL\DefaultAppPool. in C:\Program Files (x86)\Zend\ZendServer\GUI\application\CE\models\ZwasComponents\Util\Api\UserServer.php on line 86. Based on http://kb.zend.com/index.php?View=entry&EntryID=426 I tried to give the user IIS APPPOOL\DefaultAppPool permissions on the folder C:\Program Files (x86)\Zend\ZendServer\, however it still gives the same error. Do you know which resources the application pool does not have permissions to?
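
    For reference, granting the application pool identity modify rights from an elevated command prompt usually looks something like the line below - a sketch only, since the exact rights Zend Server needs may differ:

        :: Grant DefaultAppPool modify (M) rights, inherited by subfolders/files, recursively
        icacls "C:\Program Files (x86)\Zend\ZendServer" /grant "IIS APPPOOL\DefaultAppPool":(OI)(CI)M /T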

    Read the article

  • Most efficient approach for multilingual PHP website

    - by alexteg
    I am working on a large multilingual website and I am considering different approaches for making it multilingual. The possible alternatives I can think of are:
    - The Gettext functions with generation of .po files
    - One MySQL table with the translations and a unique string ID for each text
    - PHP files with arrays containing the different translations with unique string IDs
    As far as I have understood, the Gettext functions should be most efficient, but my requirement is that it should be possible to change a text string in the original reference language (English) without the other translations of that string automatically reverting back to English just because a couple of words changed. Is this possible with Gettext? What is the least resource-demanding solution? Is using the Gettext functions or PHP files with arrays more or less equally resource-demanding? Any other suggestions for more efficient solutions?
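
    As a point of comparison, the PHP-array option from the list above is trivial to sketch (the file layout, string IDs and sample Swedish text are invented here); because every string is addressed by a stable ID, editing the English master text never touches the other languages:

        <?php
        // lang/sv.php -- one file per language, same string IDs everywhere
        return [
            'home.title' => 'Välkommen till vår webbplats',
            'home.intro' => 'Vi skickar över hela världen.',
        ];

        <?php
        // t() -- look up a string ID in the requested language, fall back to English
        function t($id, $lang = 'en')
        {
            static $cache = [];
            foreach (array_unique([$lang, 'en']) as $l) {
                if (!isset($cache[$l])) {
                    $cache[$l] = include __DIR__ . "/lang/$l.php";
                }
                if (isset($cache[$l][$id])) {
                    return $cache[$l][$id];
                }
            }
            return $id; // last resort: show the ID itself
        }

        echo t('home.title', 'sv');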

    Read the article

  • SQL 2008 Report Manager not working

    - by Fatherjack
    I have a SQL 2008 Developer Edition instance with SSRS, and the Report Manager is only available from the local machine. If I try to access it from any other machine, I get challenged for my domain username and password 3 times and then the screen stays blank. I had made changes to some config files (originals copied out) in order to get a 3rd-party application to run, but that is now uninstalled and the config files are all back to vanilla (originals copied back in). I feel it's something to do with authentication but am stuck... any suggestions welcomed. Jonathan
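
    A prompt-three-times-then-blank symptom on remote access is often a Negotiate/Kerberos problem rather than a leftover config change. One thing worth checking - a sketch, not a confirmed fix - is the AuthenticationTypes section of rsreportserver.config, e.g. forcing NTLM:

        <Authentication>
            <AuthenticationTypes>
                <!-- Default is RSWindowsNegotiate; NTLM avoids Kerberos SPN issues -->
                <RSWindowsNTLM/>
            </AuthenticationTypes>
        </Authentication>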

    Read the article

  • How do you access Time Machine backups of a different computer?

    - by baloo
    I'm currently backing up using a WD My Book World network drive that supports Apple Time Machine. I would like to copy some files from my old laptop's backup. However, my old backup isn't showing when I browse the Time Machine network drive; only the currently used machine is listed (I know there are 3 different backups on it). How can I access the files not belonging to the currently used laptop?
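
    On a network drive, each machine's backup lives in its own .sparsebundle disk image named after that machine. As a rough sketch (the image and share names here are examples), the old laptop's image can be mounted by hand and then browsed in Finder:

        # Mount the NAS share first, then attach the old machine's backup image
        hdiutil attach "/Volumes/MyBookWorld/OldLaptop.sparsebundle"

        # The backups then appear as a mounted volume, e.g.
        #   /Volumes/Time Machine Backups/Backups.backupdb/OldLaptop/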

    Read the article

  • How to set up a file server in a restricted corporate environment

    - by Emilio M Bumachar
    I work in a big corporation, and the disk space my team gets on the corporate file server is so low that I am considering turning my work PC into a file server. I ask this community for links to tutorials, software suggestions, and advice in general about how to set it up. My machine is an Intel Core2Duo E7500 @ 3GHz, 3 GB of RAM, running Windows XP Service Pack 3. Upgrading, formatting or installing another OS is out of the question. But I do have Administrator privileges on the PC, and I can install programs (at least for now). A lot of security software I don't even know about is and must remain installed. But I only need communication within the corporate network, which is not restricted. People have usernames (logins) on the corporate network, and I need to use them to restrict access. Simply put, I have a list of logins of team members, and only people in the list should access the files. I have about 150 GB of free disk space. I'm thinking of allocating 100 GB to the team's shared files. I plan monthly backups on machines of co-workers with the same configuration, but automation of backups is a nice, unnecessary feature: it's totally acceptable for me to manually copy the contents to a different machine once a month. Uptime is important, as everyone would use these files in their daily work. I have experience as a Python and C programmer, but no experience whatsoever as a sysadmin, and almost nothing of my programming experience is network programming. I'm a complete beginner in this. Thanks in advance for any help. EDIT: I honestly appreciate all the warnings, I really do, but what I plan to make available is mostly stuff that is currently solely on DVDs, just for space reasons. It's 'daily work' to read them, but 'daily work write' files will remain on the corporate server. As for the importance of uptime, I think I overstated it: a few outages are OK; it's already an improvement over getting the DVDs. As for policy, my manager is kind of on my side; I will confirm that before making my move. As for getting more space through the proper channels, well, that was Plan A, and it's still on the table... but I don't have much hope. I'm not as "core business" as I'd like.
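
    Purely as a sketch of the mechanics on XP (the share name, path and accounts are placeholders, and corporate policy or Simple File Sharing may get in the way), a folder can be shared and then locked down to named users with the built-in tools:

        :: Create the share (Simple File Sharing must be disabled for per-user permissions)
        net share TeamFiles="C:\TeamFiles"

        :: Grant NTFS change (C) rights to specific domain users, keeping existing ACLs (/E)
        cacls "C:\TeamFiles" /E /G CORPDOMAIN\alice:C
        cacls "C:\TeamFiles" /E /G CORPDOMAIN\bob:C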

    Read the article

  • setting up a sub domain on windows hosting

    - by jason
    I'm trying to set up a subdomain for development on a Windows server and am having problems setting the correct details in the httpd.ini file, and hoped someone could help. I have set up the subdomain http://dev.website.com. The files that I want to use for this subdomain are on the server in a folder called development, i.e. http://www.website.com/development; in the directory structure they are in /htdocs/development. What do I need to add to the httpd.ini file to point http://dev.website.com to the files located in the /htdocs/development folder on the server?
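
    Assuming that httpd.ini belongs to ISAPI_Rewrite 2 (a common setup on shared Windows hosting - version 3 uses Apache-style directives instead), a sketch of the kind of rule that maps the subdomain onto the /development folder would be:

        [ISAPI_Rewrite]
        # Requests arriving for dev.website.com are served from /htdocs/development
        RewriteCond Host: ^dev\.website\.com$
        RewriteRule (.*) /development$1 [I,L]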

    Read the article

  • Firefox: Where does firefox store the opened windows/tabs/urls on Crash for Restoring?

    - by jens
    Hello, in which location and file does Firefox save the last windows I had open (when Firefox crashed)? I have a complete "hot dump" copy of a file system and need to restore the state Firefox was in when the system crashed, but I cannot restore the full backup itself. I can only extract Firefox's files, and I do not know which files to search for the URLs that were last open when the snapshot of the whole filesystem was taken. Thanks!!!

    Read the article

  • What format is "undf"?

    - by ZlateWay
    By accident, I recently deleted some videos from my phone. I tried restoring them with Recuva and TuneUp Utilities, and the results were "undf" files. I tried to open them with VLC, Media Player Classic and other video players, to no avail. Where/how do I find out the codec for these files? I would really love to be able to watch these videos again.

    Read the article

  • Windows command line compression/extraction tool?

    - by Will Marcouiller
    I need to write a batch file to unzip files to their current folder from a given root folder.

        Folder 0
        |----- Folder 1
        |      |----- File1.zip
        |      |----- File2.zip
        |      |----- File3.zip
        |
        |----- Folder 2
        |      |----- File4.zip
        |
        |----- Folder 3
               |----- File5.zip
               |----- FileN.zip

    So, I wish that my batch file could be launched like so: ocd.bat /d="Folder 0". It should then iterate through all of the subfolders and unzip the files exactly where the .zip files are located. So here's my question: does Windows (from XP at least) have a command line interface for its embedded zip tool? Otherwise, shall I stick to another third-party util?
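
    Windows XP's built-in zip support has no command-line interface, so a third-party tool is the usual route. Here is a sketch of a batch file along the lines asked for, assuming 7-Zip at its default install path (the /d= argument parsing is simplified to a plain first parameter):

        @echo off
        rem ocd.bat - unzip every .zip under the given root into the folder it sits in
        rem usage: ocd.bat "Folder 0"
        set "ROOT=%~1"
        set "SEVENZIP=C:\Program Files\7-Zip\7z.exe"

        for /r "%ROOT%" %%F in (*.zip) do (
            echo Extracting %%F
            "%SEVENZIP%" x "%%F" -o"%%~dpF" -y
        )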

    Read the article

  • nginx + php fpm -> 404 php pages - file not found

    - by Mahesh
    *2037 FastCGI sent in stderr: "Primary script unknown" while reading response header from upstream

        server {
            listen 80; ## listen for ipv4; this line is default and implied
            #listen [::]:80 default ipv6only=on; ## listen for ipv6
            server_name .site.com;
            root /var/www/site;
            error_page 404 /404.php;
            access_log /var/log/nginx/site.access.log;
            index index.html index.php;

            if ($http_host != "www.site.com") {
                rewrite ^ http://www.site.com$request_uri permanent;
            }

            location ~* \.php$ {
                fastcgi_index index.php;
                fastcgi_pass 127.0.0.1:9000;
                #fastcgi_pass unix:/var/run/php-fpm/php-fpm.sock;
                fastcgi_buffer_size 128k;
                fastcgi_buffers 256 4k;
                fastcgi_busy_buffers_size 256k;
                fastcgi_temp_file_write_size 256k;
                fastcgi_read_timeout 240;
                include /etc/nginx/fastcgi_params;
                fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
                fastcgi_param SCRIPT_NAME $fastcgi_script_name;
            }

            location ~ /\. {
                access_log off;
                log_not_found off;
                deny all;
            }

            location ~ /(libraries|setup/frames|setup/libs) {
                deny all;
                return 404;
            }

            location ~ ^/uploads/(\d+)/(\d+)/(\d+)/(\d+)/(.*)$ {
                alias /var/www/site/images/missing.gif;
                # i need to modify this to show only missing files. right now it is showing missing for all the files.
            }

            location ~* \.(jpg|jpeg|png|gif|ico|css|js)$ {
                access_log off;
                expires 20d;
            }

            location /user_uploads/ {
                location ~ .*\.(php)?$ {
                    deny all;
                }
            }

            location ~ /\.ht {
                deny all;
            }
        }

    The php-fpm config is default and is not touched. The problem is a little strange to me: error pages show "File not found" only if they are .php files, while other missing URLs correctly call the 404.php file. site.com/test = calls 404.php; site.com/test.php = File not found. I keep searching and making changes, but nothing has solved the problem so far.
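
    One likely reason only .php URLs bypass the 404 page is that every request ending in .php is handed to PHP-FPM even when the script does not exist, so the "File not found" text comes from FPM rather than from nginx. A common sketch of a fix (untested against this exact config) is to let nginx check for the file first:

        location ~* \.php$ {
            # Return nginx's own 404 (and therefore /404.php via error_page)
            # when the requested script is missing, instead of asking PHP-FPM for it.
            try_files $uri =404;

            fastcgi_index index.php;
            fastcgi_pass 127.0.0.1:9000;
            include /etc/nginx/fastcgi_params;
            fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        }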

    Read the article

  • open encrypted file with php

    - by stormdrain
    Hi, I'm looking for a way to open encrypted files that are stored on a server. I'm using mcrypt to encrypt the files. I was initially going to create a class that would open the file, decrypt it, write it to a new location, then open that. But I have convinced myself there is a better way (I just don't know what it is). It seems like there should be a way to stream it (?) to the browser once it's decrypted. The initial setup would just link to the file location and the browser would take over (e.g. .pdf files would bring up a dialogue offering to open or save the file). If possible, I'd like it to do the same after decoding. Pointers? Advice? Bueller? Thanks!
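
    As a minimal sketch of the streaming idea - assuming the files were encrypted with Rijndael-128 in CBC mode with the IV prepended to the ciphertext, which may not match the actual setup - the decrypted bytes can be sent straight to the browser with the right headers, without writing a temporary file:

        <?php
        // Hypothetical location and key; adjust to match how the files were encrypted.
        $path = '/path/to/storage/report.pdf.enc';
        $key  = 'your-secret-key';

        $data   = file_get_contents($path);
        $ivSize = mcrypt_get_iv_size(MCRYPT_RIJNDAEL_128, MCRYPT_MODE_CBC);
        $iv     = substr($data, 0, $ivSize);
        $cipher = substr($data, $ivSize);

        $plain = mcrypt_decrypt(MCRYPT_RIJNDAEL_128, $key, $cipher, MCRYPT_MODE_CBC, $iv);
        $plain = rtrim($plain, "\0"); // strip mcrypt's zero padding

        // Let the browser handle the file type, e.g. offer a PDF open/save dialogue.
        header('Content-Type: application/pdf');
        header('Content-Disposition: attachment; filename="report.pdf"');
        header('Content-Length: ' . strlen($plain));
        echo $plain;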

    Read the article

  • Could not upload .htaccess

    - by syalam
    I am using Springloops to automatically take my SVN repo and deploy onto my server. I am getting the following error:

        Could not upload .htaccess
        Could not upload .htaccess using BINARY transfer
        ----------------------------------------------------
        Connecting to dev.convrrt.com
        Logging in as convrrt
        Entering destination directory ~/
        Entering passive mode
        REVISION: 1 -> 30
        Getting changes
        Deleting files
        Removing directories
        Creating directories and files
        Extracting file: .htaccess...OK
        Uploading file: .htaccess [644] R: interrupted

    How can I diagnose this?

    Read the article

  • Removing orphaned iTunes songs from hard drive

    - by JubJub
    Sometimes when I delete songs from iTunes it does not ask me whether I also want to delete them from the hard drive. As a result I have a bunch of songs taking up space on my hard drive that are not in iTunes. I have over 3000 files; is there an automated way to find files on the hard drive that are NOT in iTunes? I want to delete them so that they are not taking up space.
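
    One automated approach is to read the library XML that iTunes keeps, collect every track's file location, and compare that set against what is actually on disk. A rough sketch (the paths are examples, and it assumes the "iTunes Music Library.xml" file is present):

        import os
        import plistlib
        from urllib.parse import unquote, urlparse

        LIBRARY_XML = os.path.expanduser("~/Music/iTunes/iTunes Music Library.xml")
        MUSIC_DIR = os.path.expanduser("~/Music/iTunes/iTunes Media")

        # Collect the on-disk path of every track iTunes knows about.
        with open(LIBRARY_XML, "rb") as f:
            library = plistlib.load(f)

        known = set()
        for track in library.get("Tracks", {}).values():
            location = track.get("Location")  # a file:// URL
            if location:
                # On Windows the leading slash may need stripping, e.g. "/C:/..."
                known.add(os.path.normcase(unquote(urlparse(location).path)))

        # Walk the music folder and report files iTunes does not reference.
        for root, _dirs, files in os.walk(MUSIC_DIR):
            for name in files:
                path = os.path.normcase(os.path.join(root, name))
                if path not in known:
                    print("Orphan:", path)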

    Read the article
