Search Results

Search found 60836 results on 2434 pages for 'system io directory'.

  • disable mystery programs running at startup

    - by pstanton
    Hi, and sorry for the ambiguous title... I have a few programs that should run at startup and are properly configured to do so via shortcuts in the startup directory: C:\Users\[me]\AppData\Roaming\Microsoft\Windows\Start Menu\Programs\Startup. However, I have (at least) four programs which are also starting up, and I can't find where they are configured or how to disable them. I have looked for them in the above folder, as well as in the 'Startup' section of 'msconfig'. The programs include: Skype (for which I have disabled 'start when Windows starts' in its options), Thunderbird (for which I cannot find any run-at-startup option), Task Manager (as above), and some anonymous call to javaw (I can't find any more details, but it fails anyway). The other strange thing is that these (at least Skype and Thunderbird) seem to be running 'as administrator'... I deduced this because file drag-and-drop does not work in either of them (a known problem when running 'as administrator'). If someone could guide me to where these extra programs are configured to run at startup, I would be very grateful! P.S. My user account has the administrator role. EDIT: preferably without another third-party tool...
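
    Beyond the Startup folder and msconfig, startup entries can also live in the registry Run keys and in the Task Scheduler. A quick sketch of where else to look, from an elevated command prompt (standard Windows locations, nothing machine-specific assumed):

        reg query "HKCU\Software\Microsoft\Windows\CurrentVersion\Run"
        reg query "HKLM\Software\Microsoft\Windows\CurrentVersion\Run"
        reg query "HKLM\Software\Wow6432Node\Microsoft\Windows\CurrentVersion\Run"
        rem Scheduled tasks can launch programs elevated at logon, which would
        rem also explain the "as administrator" symptom:
        schtasks /query /fo LIST /v | findstr /i "skype thunderbird javaw"

    A task set to "run with highest privileges" starts its program elevated, which matches the broken drag-and-drop behaviour described above.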

  • Sharing files between centOS on virtualbox and windows 7 as the host

    - by Wasswa Samuel
    I have CentOS 5.5 installed on VirtualBox; it has no GUI, so everything is command-line based. I want to make a folder in CentOS which I can share with my Windows 7 host OS, so that I can move files back and forth seamlessly. I am new to Linux, but I managed to install Samba. I looked up some tips on the net, yet I ended up getting confused and none of them worked. Can someone explain, in a straightforward way, how I can do this, from configuring Samba through to mounting the folder so that it can be seen on the host operating system? I am completely lost. Please help.
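
    A minimal sketch of the Samba side, assuming a share directory of /srv/share and a Linux user named samuel (both placeholders). Append to /etc/samba/smb.conf:

        [share]
            path = /srv/share
            writable = yes
            valid users = samuel

    Then, as root:

        mkdir -p /srv/share && chown samuel /srv/share
        smbpasswd -a samuel        # Samba keeps its own password database
        service smb restart && chkconfig smb on

    The VirtualBox network mode matters too: with the default NAT the host cannot reach the guest directly, so either switch the VM to bridged or host-only networking, or add a NAT port forward; after that, \\<guest-ip>\share should be browsable from Windows 7.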

  • Moving cpanel backup of magento site to VPS

    - by user2564024
    I was hosting my site on shared hosting. I took the entire cPanel backup; its structure is like: addons homedir mysql resellerpackages suspendinfo bandwidth homedir_paths mysql.sql sds userconfig counters httpfiles mysql-timestamps sds2 userdata cp locale nobodyfiles shadow va cron logaholic pds shell vad digestshadow logs proftpdpasswd ssl version dnszones meta psql sslcerts vf domainkeys mm quota ssldomain fp mma resellerconfig sslkeys has_sslstorage mms resellerfeatures suspended. Now I have subscribed to a VPS and have copied the files inside homedir/public_html to /var/www/html on the new host, but I am seeing the following error when I view the site in a browser: "There has been an error processing your request. Exception printing is disabled by default for security reasons. Error log record number: 259343920016". I have just created a database named magento inside MySQL. Previously I had cPanel and used a one-click installer, so I am not aware of how to load that backup data into MySQL on the new system, or whether any more changes are needed.
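
    A sketch of the remaining steps, assuming the backup's mysql/ directory holds a per-database dump (the dump file name below is a placeholder):

        mysql -u root -p -e 'CREATE DATABASE magento'
        mysql -u root -p magento < mysql/yourdb.sql

    Magento 1.x reads its database credentials from app/etc/local.xml, so update the host/username/password/dbname there to match the VPS, then clear the cache (rm -rf var/cache/* under the Magento root) before reloading. The actual exception behind that error page is also written to var/report/259343920016 under the Magento root, which will say exactly what is failing.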

  • What could be causing MsiInstaller to continuously reconfigure applications(EventID 1035)?

    - by user7862
    I have a brand-new machine that we installed Windows Server 2008 Enterprise on about two months ago. In the event log I am seeing thousands of EventID 1035 entries: MsiInstaller is reconfiguring about a dozen products over and over, looping roughly every half hour. Has anyone seen this? As a first step I did a general web search, and most solutions pointed to Dell System Center or the Google toolbar being installed as the culprit; we have neither of those products installed. Thanks for your help, Dale
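
    One way to see what keeps triggering the reconfiguration is Windows Installer verbose logging, via the documented Logging policy value; a sketch, from an elevated prompt:

        reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows\Installer" /v Logging /t REG_SZ /d voicewarmup

    Each MSI operation then writes an MSI*.log file to %TEMP%; the log for one of the looping products should name the feature or component whose health check fails, which is usually what drives a repeated repair/reconfigure cycle. Remember to remove the value afterwards, as verbose logging slows every install.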

  • Password History Storage and Variability Comparison

    - by z3ke
    I believe this situation is similar to many others out there, so maybe some of you can shed some light... Supposedly, when making password changes through MS Exchange every 90 days, you cannot use any simple variation of one of your old passwords, up to whatever history limit the admins set for the system. My question: if your previous passwords are only stored as hashes, how can they check for the "just changed one letter" case? Wouldn't they have to have access to the old plain-text passwords in order to make those comparisons? The only other thing I can think of is that, upon original creation of a password, they also store all of its one-character permutations, so that these can be banned later.
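
    There is a third possibility that needs no plaintext history: at change time the system does have the new password in plaintext, so it can generate every one-character variant of the new password, hash each variant, and compare those hashes against the stored history. A rough bash sketch of the idea (the hash scheme and names are illustrative, not what Exchange actually uses):

        new_pw='Spring2024'
        old_hash='...'                        # one stored history hash (placeholder)
        for ((i = 0; i < ${#new_pw}; i++)); do
          for c in {a..z} {A..Z} {0..9}; do
            candidate="${new_pw:0:i}${c}${new_pw:i+1}"
            h=$(printf '%s' "$candidate" | sha256sum | awk '{print $1}')
            [ "$h" = "$old_hash" ] && echo "one-letter variant of an old password"
          done
        done

    The cost is modest (length times alphabet-size hashes per history entry), and nothing beyond the existing hashes ever needs to be stored.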

  • Using the Dropbox API instead of an FTP server

    - by Somebody still uses you MS-DOS
    This is a small application scenario. Usually, when you have to back up source code and databases on your server, you use a second FTP server and a cron job that tar.gz's your DB dumps and source files and sends the archive from the application server to the FTP server. Dropbox created an API to use its infrastructure. Since they provide 2 GB for free accounts, I thought about uploading backups there instead of to an FTP server. So, if you do some freelance work, you could create a free account for each client and use this approach, perhaps encrypting the files you send. You even gain a revision of each uploaded file, like a revision control system, for free, covering the last 30 days. What do you think of this approach? Is it possible? And, more importantly, what are the security risks involved? (That's why I'm asking this on Server Fault, since the sysadmin point of view will be more accurate.) Thanks!
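
    A sketch of what the cron job could look like against the current v2 files/upload endpoint (the token, paths and passphrase file are placeholders; encrypting before upload addresses part of the security question, since the data will sit on infrastructure you don't control):

        #!/bin/sh
        STAMP=$(date +%F)
        ARCHIVE="/tmp/backup-$STAMP.tar.gz.gpg"
        tar czf - /var/www /var/backups/db.sql \
          | gpg --batch --symmetric --passphrase-file /root/.backup-pass -o "$ARCHIVE"
        curl -s -X POST https://content.dropboxapi.com/2/files/upload \
          --header "Authorization: Bearer $DROPBOX_TOKEN" \
          --header "Dropbox-API-Arg: {\"path\": \"/backup-$STAMP.tar.gz.gpg\"}" \
          --header "Content-Type: application/octet-stream" \
          --data-binary "@$ARCHIVE"
        rm -f "$ARCHIVE"

    The main remaining risks are the access token itself (anyone who obtains it can read everything in that account) and Dropbox being able to see your data, which the client-side encryption above mitigates.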

  • Sharing RAM resources between 2 or more computers

    - by davee44
    I know there was a somewhat similar question before: How to share CPU or RAM? But let me specify it a little more... When Microsoft Windows requires more RAM than is available, it uses a swap file to store data temporarily; this is effectively hard-drive-backed RAM, and the technique has been in use for many years. Theoretically, it shouldn't be too hard to implement something similar that uses the RAM of other computers on the network for temporary data storage. It just requires software running on the networked computers that accepts data from the main computer, keeps it in RAM, and returns it on demand, plus an operating system on the main computer able to use networked computers instead of (or in addition to) the swap file. I wonder, are there any implementations of this idea? It would allow users to build RAM clusters from all of their home or office computers and boost the performance of a single machine for development/gaming/video tasks, etc.
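
    The closest off-the-shelf building block on Linux is swapping onto a network block device: the machine lending RAM exports a RAM-backed disk, and the machine that needs memory attaches it and swapons it. A rough sketch (host name, port and sizes are placeholders; the classic nbd-server invocation is shown, newer versions prefer a config file):

        # on the machine lending its RAM:
        modprobe brd rd_size=1048576       # a 1 GiB RAM-backed /dev/ram0
        nbd-server 9000 /dev/ram0

        # on the machine that needs more memory:
        nbd-client lender-host 9000 /dev/nbd0
        mkswap /dev/nbd0 && swapon /dev/nbd0

    Expect it to be far slower than local RAM, since every page fault crosses the network, though on gigabit Ethernet it can still beat swapping to a slow disk.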

  • How to delete a massive number of files via FTP or SSH?

    - by spotlightsnap
    On my server, one of the scripts I have been using keeps creating blank files at the root of my home directory. I didn't notice for more than six months, and by now more than 500,000 files have been created. I cannot access that directory through the control panel because there are too many files; I can only get at it with FTP, and even then the file listing is truncated to 8,000 entries, so I have to keep deleting 8,000 at a time. I asked my host to delete them for me, but they say they can't because of liability issues. Since it's shared hosting, I don't have SSH access either; the provider says I can request it, but it needs to be verified and their office is closed until next week, so I am stuck with FTP for now. So what I want to know is: how can I delete all of those 500,000 files via FTP? And in case I do get SSH access, how can I delete the files efficiently via SSH? The filenames look like this: closecp.139619 closecp.139619.1 closecp.139620 closecp.139620.1 Thank you.
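
    Two sketches, assuming the files sit directly in the home directory and all match closecp.* (adjust the pattern and path as needed). The classic ftp client can glob-delete server-side, and over SSH find avoids building a 500,000-name command line:

        # FTP: turn off per-file confirmation, then glob-delete
        ftp> prompt
        ftp> mdelete closecp.*

        # SSH: delete without expanding half a million names in the shell
        find ~ -maxdepth 1 -type f -name 'closecp.*' -delete

    If the server truncates even mdelete's internal listing at 8,000 entries, re-run the command until nothing matches, or use a scriptable client such as lftp (mrm closecp.*) in a loop.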

  • Data take-on with Drupal 6

    - by Robert MacLean
    We are migrating our current intranet to Drupal 6, and there is a lot of data in the current system. It can be classified into: list data, i.e. general lists of fields (a common example is a phone list of employee phone numbers), and a document repository, basically a web version of a file share for documents. I can easily get the data plus its meta information out, but how do I bulk-load both types of data into Drupal? Uploading hundreds of thousands of items manually is just not acceptable.
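
    One hedged sketch of the programmatic route: Drupal 6's node_save() can be driven from the shell with drush, so a CSV export becomes nodes without touching the UI. The phone_entry content type and field_extension CCK field below are hypothetical stand-ins for whatever you define, and for hundreds of thousands of rows you would loop inside a single drush php-script rather than bootstrapping once per row as this loop does:

        while IFS=, read -r name ext; do
          drush php-eval "\$n = new stdClass();
            \$n->type = 'phone_entry';
            \$n->title = '$name';
            \$n->field_extension[0]['value'] = '$ext';
            node_save(\$n);"
        done < phones.csv

    The contributed node_import and migrate modules wrap the same idea behind a CSV-mapping UI; for the document repository, a similar loop can create nodes with file attachments from a directory walk.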

  • On Linux/Unix, does .tar.gz versus .zip matter?

    - by rwallace
    Cross-platform programs are sometimes distributed as .tar.gz for the Unix version and .zip for the Windows version. This makes sense when the contents of each must be different; if, however, the contents are going to be the same, it would be simpler to have just one download. Windows prefers .zip because that's the format it can handle out of the box. Does it matter on Unix? That is, I tried unzipping a file on Ubuntu Linux today and it worked fine; is there any problem with this on any current Unix-like operating system, or is it okay to just provide a .zip file across the board?
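
    The practical difference is metadata: tar records Unix permissions, ownership and symlinks as a matter of course, while zip's handling of them depends on which tools create and extract the archive. A quick experiment to see what survives the round trip (assumes Info-ZIP's zip/unzip and GNU tar):

        mkdir demo && cd demo
        touch run.sh && chmod 755 run.sh
        ln -s run.sh latest.sh
        tar czf ../demo.tar.gz .     # modes, owners and the symlink are kept
        zip -ry ../demo.zip .        # -y stores the symlink instead of its target

    So for plain source or documentation archives a .zip is fine on any current Unix-like system; for anything that relies on executable bits or symlinks, .tar.gz remains the safer single format.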

  • How to use ccache selectively?

    - by Anonymous
    I have to compile multiple versions of an app written in C++, and I am thinking of using ccache to speed up the process. The ccache howtos give examples which suggest creating symlinks named gcc, g++, etc., and making sure they appear in PATH before the real gcc binaries, so that ccache is used instead. So far so good, but I'd like to use ccache only when compiling this particular app, not always. Of course, I could write a shell script that creates these symlinks every time I want to compile the app and deletes them when the build is done, but that looks like filesystem abuse to me. Are there better ways to use ccache selectively, rather than always? For a single source file I could just call ccache manually instead of gcc and be done with it, but I am dealing with a complex app that uses an automated build system for many source files.
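
    Two standard ways to scope ccache to a single build without touching global symlinks, sketched for a make-based build (/usr/lib/ccache is the masquerade directory on Debian/Ubuntu; adjust for your distro):

        # 1. override the compiler variables for this build only:
        make CC="ccache gcc" CXX="ccache g++"

        # 2. or prepend the ccache symlink farm to PATH for one invocation:
        PATH=/usr/lib/ccache:$PATH make

    Autoconf-style projects accept the same variables at configure time (./configure CC="ccache gcc" CXX="ccache g++"), which bakes the choice into that build tree and nowhere else.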

  • Searching for just files

    - by M Schenkel
    I have a couple of questions about searching for files on Windows 7; I find the XP method much easier than the new Windows 7 search. Note: I am only interested in finding files whose names match a search term, not all files containing the search term. Is there a way to search just for file names? When I use the search, it seems to look "within" files and return any file whose contents reference the search term. Example: I have a whole web directory and want to find the JavaScript files, but if I enter "myjavascript.js" in the search box, it also returns all the HTML files that reference the JavaScript file. This is both slow and makes it difficult to find the actual file. Is there a way to search for an exact match? The search seems to use wildcards implicitly. For instance, say I have a bunch of files in a folder: file1.txt, file11.txt, file12.txt, file13.txt. If I enter "file1.txt" in the search box, it returns matches as if I had used the wildcard file1*.txt. I miss XP!!!!
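
    The Windows 7 search box accepts Advanced Query Syntax, which covers both cases; a few illustrative queries (the := exact-match operator is part of AQS, though it is worth verifying on your build):

        name:myjavascript.js      match on file name only, not file contents
        name:="file1.txt"         exact name, no implicit trailing wildcard
        ext:.js                   every file with a .js extension in scope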

  • Source File not updating Destination Files in Excel

    - by user127105
    I have one source file that holds all my input costs, and 30 to 40 destination files (costing sheets) that use links to data in this source file for their various formulae. I was sure when I started this system that any change I made to the source file, including the insertion of new rows and columns, was picked up automatically by the destination files, so that the formulae always pulled the correct input costs. Now, all of a sudden, if my destination files are closed and I change the structure of the source file by adding rows, the destination files go haywire: they pick up changes to their linked cells, but they don't pick up changes to the source sheet that have shifted those cells' relative positions. Do I really need to open all 40 destination files every time I alter the source file structure? Further info: all the destination files are protected, and I am working on Dropbox.
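
    That behaviour is, unfortunately, how external links work: the link formula in a closed workbook stores a fixed cell address, and only open workbooks get their references re-pointed when the source structure changes. Linking to defined names instead of addresses sidesteps the problem, since the name travels with the cell. A sketch (path, file and name are hypothetical):

        ='C:\Data\[inputs.xlsx]Costs'!$B$12     stored as a fixed address; goes stale
                                                if rows shift while this file is closed
        ='C:\Data\inputs.xlsx'!SteelCost        workbook-level defined name; still
                                                resolves correctly after row inserts

    So defining a name in the source file for each input cost and re-pointing the destination formulae at the names, once, avoids having to open all 40 files on every structural change.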

  • Getting a non-genuine Windows message on a genuine Windows 7

    - by user36257
    I have a genuine Windows 7 Enterprise installation on my laptop. A few hours ago, when I wanted to log into Windows, it did not accept my password. I used safe mode, and there it accepted the password I had been using before my current one: it's a work laptop and we have a password-change policy every three months, so the password that worked in safe mode was the one from the previous three months. After that I used System Restore to revert to yesterday, and this time I could log in successfully with my current password. BUT it now shows me a message that I am a victim of software counterfeiting, and when I restart Windows and log in again, all I get is a black desktop. Weird... any ideas?
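
    Since this is an Enterprise (volume-licensed) installation, the activation state can be inspected and re-triggered with the built-in slmgr script; a sketch, from an elevated command prompt:

        slmgr /dlv     show detailed licence and activation state (and the KMS host, if any)
        slmgr /ato     re-attempt activation against the KMS or MAK

    System Restore can leave the licensing store out of step with the installed state, which would fit the sequence described; re-activating usually clears both the counterfeit banner and the black desktop.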

  • Writing becomes slow after a few writes

    - by user1566277
    I am running embedded Linux on ARM with an SD card. While writing large amounts of data I see bizarre effects. For example, when I dd a 15 MB file a few times, it normally writes the file in less than 2 seconds; but after, say, 3-4 repetitions it sometimes takes 15 to 30 seconds to write the same file. If I sync after writing the file, this does not happen, but the sync itself takes a long time. If there is a long enough gap between writing two files, then presumably the kernel syncs by itself. How can I optimise the overall performance so that a write always finishes inside 2 seconds? The file system I am using is ext3. Any pointers?
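
    This looks like page-cache behaviour: the first writes land in RAM and return quickly, then once the dirty-page thresholds are crossed, subsequent writes stall behind a bulk writeback to the slow SD card. Two hedged knobs to smooth it out (values are illustrative; the *_bytes sysctls exist on reasonably recent kernels, older ones expose *_ratio instead):

        # start background writeback earlier and in smaller bursts
        sysctl vm.dirty_background_bytes=2097152   # 2 MB
        sysctl vm.dirty_bytes=8388608              # 8 MB hard limit

        # or take the page cache out of the measurement entirely
        dd if=data.bin of=/mnt/sd/data.bin bs=1M oflag=direct

    Smaller thresholds trade peak throughput for predictable latency, which is usually the right trade on SD media.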

  • Which Microsoft server applications are compatible with Windows Server 2012? [closed]

    - by Massimo
    As much as I personally find the new user interface absolutely awful (and even more so on a server OS), we'll soon have to put up with Windows Server 2012 (formerly Windows Server 8). So, let's start with the basics: which Microsoft products actually run on it? I've been looking around for a compatibility chart for a while but couldn't find one, and on the requirements/support pages for various Microsoft products (even the latest releases), Windows Server 2012 is never mentioned at all. So, what about... SQL Server, Exchange, Lync, SharePoint, System Center (CM, OM, DPM, VMM...), and so on?

  • I can see markup characters in vim `:help`

    - by Relax
    I just created a .txt file inside ~/.vim/doc to document one little function of my .vimrc, ran :helptags ~/.vim/doc, and apparently the whole vim help system went wild. Now, if I open for example :help help, I see things like: "This also works together with other characters, for example to find help for CTRL-V in Insert mode: > :help i^V <" (notice the > and < characters, which should normally be hidden). I can also see the ~ at the end of headlines and the modeline at the end of the help page (things like vim:tw=78:ts=8:ft=help:norl:). I have no idea what happened or how to fix it. Any clue? Thanks in advance!
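
    Those markers are present in every help source file; they are hidden by the help filetype's syntax rules, so seeing them usually means the buffer is no longer being treated as ft=help. A couple of hedged checks from inside an affected help window:

        :verbose set filetype?     should report filetype=help (and what last set it)
        :syntax on                 the markup is concealed by help syntax highlighting

    If the filetype is wrong or syntax is off, something in the new doc file (or a plugin loaded alongside it) is the likely trigger; note that a modeline like the vim:tw=78:ts=8:ft=help:norl: shown above is what marks a doc file as help in the first place.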

  • Reading log files from web application

    - by Egorinsk
    Hi! I want to write a small PHP application for monitoring logs on a Debian server, including syslog logs and Apache/PHP messages. The problem is that the Apache user (www-data) has no access to the /var/log directory. What would be the best way to grant the PHP application access to the logs? Let's assume that log files can be really large, like hundreds of megabytes. I have some ideas: (1) write a shell script, run via sudo, that tails the last 512 KB of a log into a separate file the application can read; that's ineffective, because it forks a new process and the data is read twice. (2) Add www-data to the adm group (which can read logs); that's insecure. (3) Start a PHP process via cron every minute to read the logs; that doesn't allow real-time monitoring, and the script runs even when I'm not reading logs, consuming CPU time (the server is in the cloud, and I'll have to pay for it). (4) Create hardlinks to all log files with lowered permissions; I guess that won't work, because logrotate may recreate the log files, changing their inode numbers. (5) Run a separate nginx/Apache server under a privileged user that may read the logs. Maybe someone has a better solution?
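
    A sixth option that avoids both the group change and any extra processes: grant www-data read access with POSIX ACLs, scoped to exactly the files needed (paths assume stock Debian, and the filesystem must be mounted with ACL support):

        setfacl -m u:www-data:rX /var/log
        setfacl -m u:www-data:r  /var/log/syslog /var/log/apache2/*.log

        # logrotate recreates files without the ACL, so add a default ACL
        # on the directory to cover newly created logs:
        setfacl -d -m u:www-data:r /var/log/apache2

    PHP can then fseek() to the tail of even a multi-hundred-megabyte file and read only the last block, so file size stops being a concern.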

  • Times on files differ by 1 sec, so Robocopy sync fails

    - by csmba
    I am trying to use Robocopy to sync (/IMG) a folder on my PC with a shared network drive. The problem is that the file timestamps (creation, modified and access) differ by 1 second between the two locations, so every time I run Robocopy it syncs the files again. By the way, the problem is the same if I delete the target file and copy it fresh with Robocopy: the new file's timestamps still differ by 1 second. Environment details: the source is Windows 7 64-bit; the target is a WD My Book World Edition NAS (1 TB), which takes its time from the online NTP pool pool.ntp.org (I don't know whether its file system is FAT or not).
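
    This is the classic filesystem-granularity mismatch: FAT-family filesystems (common on NAS volumes) store modification times with 2-second resolution, so NTFS timestamps get rounded on the target and never compare equal again. Robocopy has a documented switch for exactly this case; a sketch (paths are placeholders):

        robocopy C:\data \\mybookworld\share /MIR /FFT

    /FFT tells Robocopy to assume FAT file times and treat timestamps as equal when they are within 2 seconds, which stops the perpetual re-copy.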

  • Option and command keys in Mac OS X are swapped and keyboard preferences do not set them back.

    - by bikesandcode
    On my MacBook Pro I occasionally use external keyboards, generally Windows ones, and things have been fine. Yesterday I plugged in a new one and remapped the Command/Option keys so that the Windows/Alt keys were in the same configuration; again, nothing new here. However, this time when I unplugged the USB keyboard, the laptop's Option/Command keys remained switched. More annoying is that going into System Preferences - Keyboard - Modifier Keys and remapping the keys to actions does not work. I can use the drop-downs to disable any specific key, but switching the behaviours does nothing. (Cmd/Option is the obvious one; I also tried remapping things to Caps Lock and a few other combinations, with no joy. Restore Defaults sets the configuration to what I'd expect, but the settings are evidently ignored.) So: any ideas?

  • Nginx Removes the index.php from URL

    - by codeHead
    I have a CodeIgniter PHP application on nginx. It works as expected on Apache, but after moving to nginx I noticed that index.php is automatically removed from the URL in all my links. In fact, when I try using index.php, the request does not go to the desired URL but gets redirected to my default controller. Below is a copy of my nginx.conf file:

        server {
            listen 80;
            server_name mydomainname.com;
            root /var/www/domain/current;
            # index index.php;
            error_log /var/log/nginx/error.log;
            access_log /var/log/nginx/access.log main;

            location / {
                # Check if a file or directory index file exists, else route it to index.php.
                try_files $uri $uri/ /index.php;
            }

            location ~* \.php {
                fastcgi_pass backend;
                include fastcgi.conf;
                fastcgi_buffer_size 128k;
                fastcgi_buffers 4 256k;
                fastcgi_busy_buffers_size 256k;
                fastcgi_read_timeout 500;
                #fastcgi_param SCRIPT_FILENAME $document_root/index.php;
                add_header Expires "Thu, 01 Jan 1970 00:00:01 GMT";
                add_header Cache-Control "no-cache, no-store, private, proxy-revalidate, must-revalidate, post-check=0, pre-check=0";
                add_header Pragma no-cache;
                add_header X-Served-By $hostname;
            }

            location ~* ^.+\.(css|js)$ {
                expires 7d;
                add_header Pragma public;
                add_header Cache-Control "public";
            }

            # set expiration of assets to MAX for caching
            location ~* \.(ico|gif|jpe?g|png)(\?[0-9]+)?$ {
                expires max;
                log_not_found on;
            }
        }

    I need to use my URLs with index.php -- please help.
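
    One likely culprit: a request like /index.php/welcome/index does match the \.php location, but nothing passes the trailing /welcome/index on as PATH_INFO, so CodeIgniter sees no route segments and falls back to the default controller. A hedged sketch of the usual fix (keep the buffering and header directives from the block above; verify the parameter names against your fastcgi.conf):

        location ~* \.php(/|$) {
            include fastcgi.conf;
            fastcgi_split_path_info ^(.+\.php)(/.*)$;
            fastcgi_param PATH_INFO $fastcgi_path_info;
            fastcgi_pass backend;
            # ...buffer, timeout and cache-header directives as in the original block
        }

    On the CodeIgniter side, setting $config['uri_protocol'] = 'PATH_INFO'; in application/config/config.php makes it read routes from that variable.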

  • How to know when the Client is disconnected from Server in C#?

    - by menacheb
    Hi, I have a client-server program in C#. Here is the server's code:

        ...
        String dataFromClient = "";
        NetworkStream networkStream;
        TcpClient clientSocket;
        bool transfer = true;
        ...
        while (transfer)
        {
            networkStream = clientSocket.GetStream();
            // Read() returns the number of bytes received; it returns 0 once the
            // peer has closed the connection, and that value is ignored here.
            networkStream.Read(bytesFrom, 0, (int)clientSocket.ReceiveBufferSize);
            dataFromClient = System.Text.Encoding.ASCII.GetString(bytesFrom);
            dataFromClient = dataFromClient.Substring(0, dataFromClient.IndexOf("$"));
            ....
        }

    I want to add a condition that stops the loop when the client is disconnected. How can I do that? Many thanks,

  • Getting PAM/user info into php - something like Net_Finger instead of a db?

    - by digitaltoast
    I've got a very small user group who just need to log in, upload, check, and then move specific files to a different area when ready. Right now I use the nginx PAM auth module to log them in against their Unix accounts. As their login is their home directory, I've already got the info to send the uploads to the right area: one line of PHP and no database needed. But I'm maintaining a separate DB just so PHP can welcome them, grab their email, and send them an email when processing is done. Yes, sure, I could use NoSQL or SQLite instead, so as to not need a whole MySQL install. But it occurred to me that, since I've got all these blank user fields for phone numbers that I could populate with any data, I could use something like PHP's Net_Finger. Which failed for me with:

        sudo pear install Net_Finger
        Starting to download Net_Finger-1.0.1.tgz (1,618 bytes)
        ....done: 1,618 bytes
        could not extract the package.xml file from "/build/buildd/php5-5.5.9+dfsg/pear-build-download/Net_Finger-1.0.1.tgz"
        Download of "pear/Net_Finger" succeeded, but it is not a valid package archive
        Error: cannot download "pear/Net_Finger"

    At which point I thought I'd stop and take a Server Fault reality check: is this a really bad/dangerous/stupid idea just to save me maintaining details in two places rather than one? Is there a better way? Googling shows that it's not an oft-asked thing, so perhaps with good reason?
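
    For what it's worth, the account database already has a free-text slot: the GECOS field in /etc/passwd, which is exactly what finger reads. A sketch of stashing and retrieving contact details there (the -o/--other flag belongs to the shadow-utils chfn and is root-only; putting an email address there is our own convention, not a standard):

        chfn -f 'Jane Smith' -o 'jane@example.com' jane
        getent passwd jane | cut -d: -f5     # GECOS is field 5 of the passwd entry

    On the PHP side, posix_getpwnam('jane')['gecos'] returns the same field with no finger daemon, no PEAR package and no second database.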

  • Internet Explorer 8 on Windows 7 - file download results in "C:\ location not accessible"

    - by Soulhuntre
    OK, an odd problem. For one user on a Windows 7 x64 system, attempting to download a file with Internet Explorer 8 results in the error "C:\ location not accessible. Access is denied." even though the target location is not on the C:\ drive. No other user has the problem, and no other browser does. The user does not remember deleting any folders recently, and it looks like no new software has been installed. A chkdsk comes up clean. Any ideas? I grabbed some information via Process Monitor that may help with seeing the problem:
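
    One per-user cause worth ruling out is a redirected shell folder that no longer exists: IE resolves its download location through the user shell folder registry values, and a stale entry produces exactly this sort of access error. A hedged check, run in that user's session:

        reg query "HKCU\Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders"

    Look for the Downloads value ({374DE290-123F-4565-9164-39C4925E467B}) or any other entry pointing at a drive or folder that is gone; the Process Monitor capture should show the same path coming back ACCESS DENIED or NAME NOT FOUND.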

  • What questions do I need to ask for a database sync?

    - by user65745
    I am currently helping to implement an RFID inventory management system for my company. The software that we are locked into has been, at best, buggy and unreliable, and the vendor is now rolling out a major release. My problem is that the new release keeps a local database on each machine, which then syncs to a master database online. According to the software company, we cannot do a staged rollout because of data-corruption issues between the software releases. What questions should I be asking, and what sort of testing can I do on my end, to make sure this software works? Any suggestions would be very helpful.
