Search Results

Search found 41795 results on 1672 pages for 'hidden files'.


  • MySQL open files limit

    - by Brian
    This question is similar to set open_files_limit, but there was no good answer. I need to increase my table_open_cache, but first I need to increase the open_files_limit. I set the option in /etc/mysql/my.cnf: open-files-limit = 8192 This worked fine in my previous install (Ubuntu 8.04), but now in Ubuntu 10.04, when I start the server up, open_files_limit is reported to be 1710. That seems like a pretty random number for the limit to be clipped to. Anyway, I tried getting around it by adding a line like this in /etc/security/limits.conf: mysql hard nofile 8192 I also tried adding this to the pre-start script in mysql's upstart config (/etc/init/mysql.conf): ulimit -n 8192 Obviously neither of those things worked. So where is the hoop that has been added between Ubuntu 8.04 and 10.04 through which I must jump in order to actually increase the open files limit?
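
    A hedged sketch of one thing worth checking, assuming MySQL on Ubuntu 10.04 is started by upstart: upstart jobs ignore /etc/security/limits.conf, so the nofile ceiling usually has to be declared in the job file itself (the values below are just the ones from the question).

        # Add a "limit" stanza to the upstart job and restart MySQL.
        echo 'limit nofile 8192 8192' | sudo tee -a /etc/init/mysql.conf
        sudo service mysql restart
        mysql -u root -p -e "SHOW VARIABLES LIKE 'open_files_limit'"   # verify the new ceiling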

    Read the article

  • Join .doc files into one .doc (with keeping the original format of every document)

    - by Shiki
    I have about 50 .doc files that look perfect (they were extracted with Able2Extract). Now I want to join these 50 files into one huge .doc. I've tried using Word's built-in "Insert" feature, but that messed up the whole format. I want to keep everything I have - just document1 - document2 - document3. Nothing "intelligent" or "smart" is needed during the conversion, just the capability of joining them. (Thus making them all searchable; that's the ultimate aim.) I don't mind if the method/solution adds a single blank page at the end of every document either.

    Read the article

  • Script to copy files on CD and not on hard disk to a new directory

    - by John22
    I need to copy files from a set of CDs that have a lot of duplicate content - among themselves, and with what's already on my hard disk. The file names of identical files are not the same, and they sit in sub-directories with different names. I want to copy the non-duplicate files from the CDs into a new directory on the hard disk. I don't care about the sub-directories - I will sort that out later - I just want the unique files. I can't find software to do that - see my post at SuperUser: http://superuser.com/questions/129944/software-to-copy-non-duplicate-files-from-cd-dvd Someone at SuperUser suggested I write a script using GNU's "find" and the Win32 version of some checksum tools. I glanced at that, but have not done anything like that before. I'm hoping something exists that I can modify. I found a good program to delete duplicates, Duplicate Cleaner (it compares checksums), but it won't help me here, as I'd have to copy all the CDs to disk first, and each is probably about 80% duplicates, and I don't have room to do that - I'd have to cycle through a few at a time, copying everything, then turning around and deleting 80% of it, working the hard drive a lot. Thanks for any help.
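
    Along the lines of the SuperUser suggestion, a rough sketch of such a script (paths are placeholders; assumes GNU findutils/coreutils, e.g. under Cygwin): it hashes everything already on the hard disk once, then copies only those CD files whose checksum has not been seen before.

        #!/bin/bash
        HDD=/cygdrive/d/archive      # existing collection on the hard disk
        CD=/cygdrive/e               # mounted CD
        DEST=/cygdrive/d/new-unique  # where unique files land

        mkdir -p "$DEST"
        # Build a lookup table of checksums already on the hard disk.
        find "$HDD" -type f -exec md5sum {} + | awk '{print $1}' | sort -u > /tmp/known.md5

        # Copy CD files whose checksum is not already known.
        find "$CD" -type f | while IFS= read -r f; do
            sum=$(md5sum "$f" | awk '{print $1}')
            if ! grep -qxF "$sum" /tmp/known.md5; then
                cp -n "$f" "$DEST/"              # -n: don't clobber same-named files
                echo "$sum" >> /tmp/known.md5    # also catches duplicates between CDs
            fi
        done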

    Read the article

  • Getting content with images into the Web Browser control without temporary files

    - by Revision17
    For a client-server application, I'd like to display content containing images in a WebBrowser control without writing temporary files to disk. I've tried using mht files via DocumentStream and DocumentText, but the WebBrowser control isn't smart enough to recognize mht files. I would use data URI images, however most computers this will be installed on use IE6 or IE7. Are there any other options for this?

    Read the article

  • Can I grant permissions on files in windows 7 using a security identifier from another machine

    - by Thomas
    I have an external hard drive, and I wish to grant permissions on some files to users from 2 different computers without having to hook it up to those 2 computers. I know the SID of the user on the other computer; I'd like to know if and how I can grant permissions to files using the SID. I'm running Windows 7 Professional 64-bit, and "the other" computer runs Windows 7 Home Premium 64-bit. They are not in a domain, but are separate computers on a home network (not even the same homegroup). Note: duplicate of: Is there a way to give NTFS file permissions to users from other Windows installations?
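
    If it helps, icacls does accept a raw SID when it is prefixed with an asterisk, so something along these lines may work from the machine the drive is currently attached to (the path and SID below are placeholders; run from an elevated prompt):

        icacls "E:\SomeFolder" /grant *S-1-5-21-1111111111-2222222222-3333333333-1001:(OI)(CI)M /T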

    Read the article

  • Get recursive list of svn:ignore'd files

    - by Joseph Mastey
    I have an existing repo which I use for project A, and it has some files and directories excluded from it using svn:ignore. I want to start another project (project B) in a new repo, with approximately the same files ignored in it. How can I get a list of all files in the repo that have svn:ignore set on them, along with the value of that property? I am using Ubuntu, so sed and grep away if that helps. Thanks, Joe
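
    For reference, svn can dump the property recursively in one go; a minimal sketch, run from the root of project A's working copy:

        # Print every path that carries svn:ignore, together with the property value.
        svn propget -R svn:ignore .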

    Read the article

  • nginx: php-fastcgi running but php files not executing

    - by Daniel
    I have recently set up an nginx server with PHP running as a FastCGI process. The server serves HTML files fine, however PHP files are downloaded instead of displayed and the PHP code is not processed. This is what I have in nginx.conf:

        server {
            listen 80;
            server_name pubserver;

            location ~ \.php$ {
                root /usr/share/nginx/html;
                fastcgi_pass 127.0.0.1:9000;
                fastcgi_index index.php;
                fastcgi_param SCRIPT_FILENAME /usr/share/nginx/html$fastcgi_script_name;
                include fastcgi_params;
            }
        }

    The command netstat -tulpn | grep :9000 displays the following, which indicates php-fastcgi is running and listening on port 9000:

        tcp   0   0 127.0.0.1:9000   0.0.0.0:*   LISTEN   2663/php-cgi

    If it's of any importance, my server is running on CentOS 6 and I installed nginx and PHP using the repositories from The Fedora Project.
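
    One common cause of .php files being offered as downloads, offered here only as a guess: the request is being answered by a different server block (for example a stock default.conf) that has no PHP handler, or there is no root/index defined outside the php location so ordinary requests never reach it. A minimal sketch of a combined server block, assuming the same document root:

        server {
            listen 80;
            server_name pubserver;
            root /usr/share/nginx/html;
            index index.php index.html;

            location / {
                try_files $uri $uri/ =404;
            }

            location ~ \.php$ {
                fastcgi_pass 127.0.0.1:9000;
                fastcgi_index index.php;
                fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
                include fastcgi_params;
            }
        }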

    Read the article

  • Allow selection of readonly files from SaveFileDialog?

    - by Andy Dent
    I'm using Microsoft.Win32.SaveFileDialog in an app where all files are saved as readonly, but the user needs to be able to choose existing files. The existing files being replaced are renamed, e.g. blah.png becomes blah.png-0.bak, blah.png-1.bak and so on. Thus, the language for the OverwritePrompt is inappropriate - we are not allowing them to overwrite files - so I'm setting dlog.OverwritePrompt = false; The initial filenames for the dialog are generated based on document values, so for them it's easy - we rename the candidate file in advance and, if the user cancels or chooses a different name, rename it back again. When I delivered the feature, testers swiftly complained because they wanted to repeatedly save files with names different from the agreed workflow (those goofy, playful guys!). I can't figure out a way to do this with the standard dialogs, in a way that will run safely on both XP (for now) and Windows 7. I was hoping to hook into the FileOK event, but that is called after I get a warning dialog:

        blah.png
        This file is set to read-only.
        Try again with a different file name.

    Read the article

  • icacls in Windows 7 does not give full permission to write files in the root drive

    - by Menuta
    icacls in Windows 7 does not give full permission to write files in the root drive. We have a very old application based on Omnis7 that needs to create and read/write files on drive C: when running as a restricted user. In Windows XP, giving this permission is quite trivial using cacls:

        cacls C:\ /G Everyone:(C)

    The equivalent icacls in Windows 7 will not work:

        icacls C:\ /Grant Everyone:(M)

    I have also tried the following:

        icacls C:\ /Grant Everyone:(F)
        icacls C:\ /Grant Domain\user:(F)

    Trying to create a file with a restricted user gives this:

        C:\>copy nul text.txt
        Access is denied.
                0 file(s) copied.

    After applying the icacls permissions above, the result changes to this:

        C:\>copy nul text.txt
        A required privilege is not held by the client.
                0 file(s) copied.

    Is this an issue with the way I am applying the permissions? Or is Windows 7 being extremely strict?
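
    Not an answer to the error itself, but a workaround that is often suggested for legacy applications: if the application's data path can be redirected at all, grant the right on a dedicated folder rather than on C:\ itself, which Windows 7 protects far more aggressively than XP did. A hedged sketch (the folder name is a placeholder; run from an elevated prompt):

        mkdir C:\Omnis7Data
        icacls C:\Omnis7Data /grant Everyone:(OI)(CI)M /T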

    Read the article

  • Does Windows DFS always keep some files backlogged?

    - by Badger
    I have been monitoring our DFS backlog and I have noticed that it hasn't really dropped below 1000 or so files. I am assuming this means it needs to use more bandwidth. So starting last night I allowed it to use 512Kbps between 6pm and 4am, when it used to only get 128Kbps. I noticed a large drop, but it never went below about 1500 files. I just want to be sure my conclusion about needing to use more bandwidth is correct before I tell my boss about it. I have included a graph of the data showing my stats from yesterday afternoon and last night. DFS Backlog Graph
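
    For what it's worth, the backlog can also be sampled per replicated folder and per partner from the command line, which makes it easier to tell whether the remaining ~1500 files are the same ones stuck or just normal churn regenerating during the day. A sketch with placeholder names:

        dfsrdiag backlog /rgname:"Replication Group" /rfname:"Replicated Folder" /smem:SERVER-A /rmem:SERVER-B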

    Read the article

  • Displaying .aspx and .aspx.cs files such that they are not understood by the browser

    - by user287745
    I have to give the user the option to upload his own aspx and aspx.cs files to the server, and adjust a hyperlink to point to a page which would do the following: display the code of the aspx and aspx.cs files on the page without actually rendering it. The browser should not understand anything, and while reading the files to display them, nothing in the code within the files should be processed on the server, to prevent the unnecessary problems many users would try to cause. I have tried many ways of displaying it, but it ends up displaying the actual comments instead of the code. Please suggest, with links to examples, how to achieve the above. Please note the main concentration is on ASP.NET and C# using VS08, so JScript and ready-made tools should be avoided if feasible. Thank you all.

    Read the article

  • Using a script that uses Duplicity + S3 excluding large files

    - by Jason
    I'm trying to write a backup script that will exclude files over a certain size. If I run the script, duplicity gives an error. However if I copy and paste the same command generated by the script everything works... Here is the script:

        #!/bin/bash
        # Export some ENV variables so you don't have to type anything
        export AWS_ACCESS_KEY_ID="accesskey"
        export AWS_SECRET_ACCESS_KEY="secretaccesskey"
        export PASSPHRASE="password"

        SOURCE=/home/
        DEST=s3+http://s3bucket
        GPG_KEY="gpgkey"

        # exclude files over 100MB
        exclude () {
            find /home/jason -size +100M \
            | while read FILE; do
                echo -n " --exclude "
                echo -n \'**${FILE##/*/}\' | sed 's/\ /\\ /g'  #Replace whitespace with "\ "
            done
        }

        echo "Using Command"
        echo "duplicity --encrypt-key=$GPG_KEY --sign-key=$GPG_KEY `exclude` $SOURCE $DEST"

        duplicity --encrypt-key=$GPG_KEY --sign-key=$GPG_KEY `exclude` $SOURCE $DEST

        # Reset the ENV variables.
        export AWS_ACCESS_KEY_ID=
        export AWS_SECRET_ACCESS_KEY=
        export PASSPHRASE=

    When the script is run I get the error:

        Command line error: Expected 2 args, got 6

    Where am I going wrong?
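
    One guess at the cause, offered as a sketch rather than a confirmed fix: when `exclude` is expanded inside backticks, the embedded single quotes are not re-parsed by the shell, so any filename containing spaces splits into extra arguments (which would explain "Expected 2 args, got 6"), whereas pasting the echoed command makes the shell honour the quotes. Assembling the command as a string and running it through eval preserves the quoting:

        CMD="duplicity --encrypt-key=$GPG_KEY --sign-key=$GPG_KEY $(exclude) $SOURCE $DEST"
        echo "Using Command"
        echo "$CMD"
        eval "$CMD"   # re-parse the embedded quotes around each --exclude pattern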

    Read the article

  • Deploy Connectivity Platform Folder/Files

    - by Dave
    I'm using the Microsoft.SmartDevice.Connectivity features, but I do not know how to get the CoreCon folder and files under "C:\Documents and Settings\All Users\Application Data\Microsoft\corecon\1.0\1033" deployed as part of an install project. The "new DatastoreManager(1033)" fails because users of my app do not have that folder and the device xml files. Any ideas on how to install or deploy those files? I assume I have those as part of an API I've installed, but cannot pinpoint the source. Thanks

    Read the article

  • How to highly compress files

    - by Alfred James
    I want to learn how we can highly compress a file. I have downloaded files whose compressed size was 4 GB and after expanding they became a 10 GB file. So which software do I need to use for such high compression? Will I lose quality of the data by compression? If you direct me to WinRAR or 7-Zip, please tell me how I can highly compress a file, because I have tried compression with them but the size was reduced by just a few MB. I am compressing games, i.e. .exe files. Thanks
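
    As a rough illustration of asking 7-Zip for its strongest settings from the command line (archive name and folder are placeholders); note that data that is already compressed - video, music, and most packed game assets and .exe files - will barely shrink no matter which settings are used, and lossless compression never reduces quality:

        7z a -t7z -m0=lzma2 -mx=9 -mfb=64 -md=64m -ms=on backup.7z "C:\Games\SomeGame"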

    Read the article

  • Delete specific files after installation using visual studio setup project

    - by Vadiklk
    I have this problem. I want to build an installer for my C# solution that will be placed in a folder with other installation folders and files that need to be copied to the installation folder. So that part is easy: I just copy them to the folder I create, using the folder structure I want. Now, I also want to install another program and run a .exe file I've created to unzip some files for me. For that I need to copy 2 .exe files and 2 dlls (for the exes) to the folder to which I am installing, and create 2 custom actions that will use them. That I've managed to do. After that I want to delete those 4 extra files, as the user does not need them and shouldn't even be aware they are there. How do I do that? I couldn't find a way in the built-in setup project preferences, and I do not know how to make a custom installer class. A bonus question: how do I make the other installer (one of the .exe files is just a plain installer) install quietly to any path? I do not want the user to see an installer pop out of my program installer. Thanks!

    Read the article

  • Taking ownership of trustedinstaller files?

    - by P a u l
    vista32-sp1: I am unable to delete some files on my system that were installed with 'special permissions' by 'trustedinstaller'. I find the usual help suggestion to use 'takeown' is not working; all I get is access denied. I refuse to believe there isn't some way to delete these files, or that Microsoft has finally achieved their perfect security filesystem. This is NOT a case of a file being locked by a process. If that were all it was, I could solve it by myself. I know there are some recommended unlocking programs and they might do some sort of file system trick, but I would like to know what my possible direct actions might be. If a 3rd-party program can 'unlock' a file, I want to know the mechanism. But like I said, 'takeown' at the command line is not working for this.
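
    For reference, the usual direct sequence is to take ownership first and only then grant yourself rights, both from an elevated command prompt; takeown needs the elevated prompt, and even once ownership changes, delete keeps failing until rights are granted. A sketch with a placeholder path:

        takeown /f "C:\Windows\SomeFolder\somefile.dll" /a
        icacls "C:\Windows\SomeFolder\somefile.dll" /grant Administrators:F
        del "C:\Windows\SomeFolder\somefile.dll"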

    Read the article

  • Uploading zip files in CodeIgniter won't work

    - by krike
    I have created a helper that requires some parameters and should upload a file. The function works for images, however not for zip files. I searched on Google and even added a MY_upload.php - http://codeigniter.com/bug_tracker/bug/6780/ - however I still have the problem, so I used print_r to display the array of the uploaded files. The image array is fine, however the zip array is empty:

        Array
        (
            [file_name] =>
            [file_type] =>
            [file_path] =>
            [full_path] =>
            [raw_name] =>
            [orig_name] =>
            [file_ext] =>
            [file_size] =>
            [is_image] =>
            [image_width] =>
            [image_height] =>
            [image_type] =>
            [image_size_str] =>
        )

        Array
        (
            [file_name] => 2385b959279b5e3cd451fee54273512c.png
            [file_type] => image/png
            [file_path] => I:/wamp/www/e-commerce/sources/images/
            [full_path] => I:/wamp/www/e-commerce/sources/images/2385b959279b5e3cd451fee54273512c.png
            [raw_name] => 2385b959279b5e3cd451fee54273512c
            [orig_name] => 1269770869_Art_Artdesigner.lv_.png
            [file_ext] => .png
            [file_size] => 15.43
            [is_image] => 1
            [image_width] => 113
            [image_height] => 128
            [image_type] => png
            [image_size_str] => width="113" height="128"
        )

    This is the helper function:

        function multiple_upload($name = 'userfile', $upload_dir = 'sources/images/', $allowed_types = 'gif|jpg|jpeg|jpe|png', $size)
        {
            $CI =& get_instance();

            $config['upload_path']   = realpath($upload_dir);
            $config['allowed_types'] = $allowed_types;
            $config['max_size']      = $size;
            $config['overwrite']     = FALSE;
            $config['encrypt_name']  = TRUE;

            $ffiles = $CI->upload->data();
            echo "<pre>";
            print_r($ffiles);
            echo "</pre>";

            $CI->upload->initialize($config);

            $errors = FALSE;

            if (!$CI->upload->do_upload($name)): // I believe this is causing the problem, but I'm new to CodeIgniter so no idea where to look for errors
                $errors = TRUE;
            else:
                // Build a file array from all uploaded files
                $files = $CI->upload->data();
            endif;

            // There was errors, we have to delete the uploaded files
            if ($errors):
                @unlink($files['full_path']);
                return false;
            else:
                return $files;
            endif;
        } // end of multiple_upload()

    And this is the code in my controller:

        if (!$s_thumb = multiple_upload('small_thumb', 'sources/images/', 'gif|jpg|jpeg|jpe|png', 1024)): // http://www.matisse.net/bitcalc/
            $data['feedback'] = '<div class="error">Could not upload the small thumbnail!</div>';
            $error = TRUE;
        endif;

        if (!$main_file = multiple_upload('main_file', 'sources/items/', 'zip', 307200)):
            $data['feedback'] = '<div class="error">Could not upload the main file!</div>';
            $error = TRUE;
        endif;

    Read the article

  • Problem with script that excludes large files using Duplicity and Amazon S3

    - by Jason
    I'm trying to write a backup script that will exclude files over a certain size. If I run the script, duplicity gives an error. However if I copy and paste the same command generated by the script everything works... Here is the script:

        #!/bin/bash
        # Export some ENV variables so you don't have to type anything
        export AWS_ACCESS_KEY_ID="accesskey"
        export AWS_SECRET_ACCESS_KEY="secretaccesskey"
        export PASSPHRASE="password"

        SOURCE=/home/
        DEST=s3+http://s3bucket
        GPG_KEY="gpgkey"

        # exclude files over 100MB
        exclude () {
            find /home/jason -size +100M \
            | while read FILE; do
                echo -n " --exclude "
                echo -n \'**${FILE##/*/}\' | sed 's/\ /\\ /g'  #Replace whitespace with "\ "
            done
        }

        echo "Using Command"
        echo "duplicity --encrypt-key=$GPG_KEY --sign-key=$GPG_KEY `exclude` $SOURCE $DEST"

        duplicity --encrypt-key=$GPG_KEY --sign-key=$GPG_KEY `exclude` $SOURCE $DEST

        # Reset the ENV variables.
        export AWS_ACCESS_KEY_ID=
        export AWS_SECRET_ACCESS_KEY=
        export PASSPHRASE=

    When the script is run I get the error:

        Command line error: Expected 2 args, got 6

    Where am I going wrong?

    Read the article

  • Backup script that excludes large files using Duplicity and Amazon S3

    - by Jason
    I'm trying to write a backup script that will exclude files over a certain size. My script gives the proper command, but when run within the script it outputs an error. However if the same command is run manually everything works... Here is the script, based on one easily found with Google:

        #!/bin/bash
        # Export some ENV variables so you don't have to type anything
        export AWS_ACCESS_KEY_ID="accesskey"
        export AWS_SECRET_ACCESS_KEY="secretaccesskey"
        export PASSPHRASE="password"

        SOURCE=/home/
        DEST=s3+http://s3bucket
        GPG_KEY="7743E14E"

        # exclude files over 100MB
        exclude () {
            find /home/jason -size +100M \
            | while read FILE; do
                echo -n " --exclude "
                echo -n \'**${FILE##/*/}\' | sed 's/\ /\\ /g'  #Replace whitespace with "\ "
            done
        }

        echo "Using Command"
        echo "duplicity --encrypt-key=$GPG_KEY --sign-key=$GPG_KEY `exclude` $SOURCE $DEST"

        duplicity --encrypt-key=$GPG_KEY --sign-key=$GPG_KEY `exclude` $SOURCE $DEST

        # Reset the ENV variables.
        export AWS_ACCESS_KEY_ID=
        export AWS_SECRET_ACCESS_KEY=
        export PASSPHRASE=

    If run, I receive the error:

        Command line error: Expected 2 args, got 6
        Enter 'duplicity --help' for help screen.

    Any help you could offer would be greatly appreciated.

    Read the article

  • Mercurial Messing Up csproj Files?

    - by alphadogg
    I am using Hg to manage and merge code with three other developers involved in a VS2008 project. We do have an .hgignore file that ignores a fair number of files not necessary to track, such as *.pdb, *.obj, etc. However, we do track .csproj files. Periodically, it would seem that files go missing after a merge. We would get build issues, and have to relocate files which were in the project folders, but not in the csproj file. Eventually, I noted during a merge conflict that sometimes Hg seems to merge incorrectly. Here's a screenshot below. The actual conflict that requires manual intervention is lower in the file. But in this section, hg incorrectly replaces DirectoryTasks.cs with a new, different file called ReportTasks.cs, when in fact, both should be added. How do people manage to avoid this?

    Read the article

  • Error when trying to access Shared files from iMac via smb

    - by SatheeshJM
    I used to access all my Windows XP shared files on my Mac using Finder -- Window -- Connect to server. Now all of a sudden, an error crops up when I try to connect. I get the error "There was a problem connecting to the server "192.168.1.*". The server may not exist or it is unavailable at this time. Check the server name or IP address, check your internet connection and then try again." How can I remove this error and access my shared files from my Mac? P.S. My network connection is fine.
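
    One quick way to narrow this down from the Mac side is to test the share from Terminal; if smbutil can see the XP machine, the problem is in the Finder/URL step, otherwise it is the network or the XP side. The address, account and port below are placeholders and assumptions:

        # List the shares the XP box is offering (prompts for the XP account's password).
        smbutil view //xpuser@192.168.1.5
        # Basic reachability check on the SMB port.
        nc -z 192.168.1.5 445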

    Read the article

  • Deleting certain files sits at "preparing to recycle" on Windows 7?

    - by Rachel
    We recently set up one of our users with a brand new Windows 7 computer, however she is unable to delete certain files. With some testing, I found I cannot move, rename, or view properties of these files either. When trying to delete the file, it just sits at the "Preparing to recycle" popup, however the "from" section says "Discovering items..." Clicking "More Details" on the popup shows me that it can't find the file name or where it's recycling from. Other notes...

    - All the affected files are .pdf files that get created via a scanner. Other pdf files are fine.
    - Opening the files works fine. I can open the file, Save As a new file, and delete the new one just fine.
    - Trying to delete the file via command prompt just sits there.
    - Rebooting the computer will let me manipulate the files like normal, however this user is responsible for scanning hundreds of documents a day and I'd rather not have to tell her to reboot her computer to delete files.
    - The user is part of the administrator group on the computer.
    - The Owner of the affected files is the user.
    - The attrib of the files is just A.
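
    Not from the article, but the symptoms (fine after a reboot, hangs on delete/rename/properties) look like an open handle still held by the scanning software; Sysinternals Handle can confirm that without rebooting. The file name below is a placeholder; run from an elevated prompt:

        handle.exe stuck-document.pdf

    If a process shows up holding the file, closing or restarting that process should release it without a full reboot.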

    Read the article

  • Copy files between two Windows machines on separate domains

    - by Simon
    I need to copy several database backups between two computers. The source computer initiates the copy; it is a Windows 2000 PC and is a member of domain1. The destination machine is running Windows 2000 Server and is a member of domain2. The machines are on separate networks physically connected via a firewall. The files are currently copied via ssh with http://sshwindows.sourceforge.net/ installed on the destination machine. There is no need to encrypt the contents during the copy; however, the passwords should not be sent in the clear. I am looking for a way to copy the files without having to install a server on the destination. I specifically need help with how to set up the permissions and what ports would need to be opened on the firewall.
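
    If installing nothing extra on the destination is the hard requirement, plain SMB may already fit: the Server service is built into Windows 2000 Server, the firewall only needs TCP 445 (or 139) opened from source to destination, and NTLM authentication does not send the password in the clear. A rough sketch run from the source machine (share, account and paths are placeholders):

        net use \\dest-server\Backups /user:DOMAIN2\backupuser *
        xcopy "D:\SQLBackups\*.bak" "\\dest-server\Backups\" /D /Y
        net use \\dest-server\Backups /delete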

    Read the article

  • Executed PHP files are stale until "touched" (Symlinked NFS mount as web root)

    - by mmattax
    We have a PHP application that runs on 3 web servers (running Nginx and Apache). The web servers' document roots are symlinked directories that point to an NFS mount. For example: web01 has an NFS mount at /data/webapp, which is symlinked to /home/webapp. Apache serves content from /home/webapp/www. We also use APC for our PHP opcode cache. When we deploy code, we SCP an archive file to the NFS server and extract it. Since upgrading to RedHat 6, when we deploy our code the web servers execute "stale" PHP files until touch is run on the PHP files. We thought that APC might be causing the problem, but the issue exists even after clearing the opcode cache. Any ideas on how to diagnose why the stale PHP code is being executed?
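
    One theory worth checking: NFS clients cache file data keyed on attributes such as size and mtime, and extracting an archive typically preserves the archived (often older) timestamps, so the clients keep serving cached content until touch changes the mtime. Two hedged mitigations, with the archive name and mount details as placeholders:

        # Extract without preserving archived mtimes, so every file is stamped with "now".
        tar -xzf release.tar.gz -C /data/webapp -m
        # Or shorten the attribute cache on the web servers' NFS mounts (fstab excerpt):
        # nfs01:/export/webapp  /data/webapp  nfs  rw,actimeo=3  0 0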

    Read the article

  • Grand Central Strategy for Opening Multiple Files

    - by user276632
    I have a working implementation using Grand Central Dispatch queues that (1) opens a file and computes an OpenSSL DSA hash on "queue1", and (2) writes out the hash to a new "sidecar" file for later verification on "queue2". I would like to open multiple files at the same time, but based on some logic that doesn't "choke" the OS by having hundreds of files open and exceeding the hard drive's sustainable output. Photo browsing applications such as iPhoto or Aperture seem to open multiple files and display them, so I'm assuming this can be done. I'm assuming the biggest limitation will be disk I/O, as the application can (in theory) read and write multiple files simultaneously. Any suggestions? TIA

    Read the article
