Search Results

Search found 49518 results on 1981 pages for 'configuration files'.


  • Speed up file access on home network

    - by kurasa
    I have two PCs (Windows 7 Ultimate) and a Mac running Windows 7 under VMware Fusion, tied together on my home network by a NETGEAR WRN1000 router. One of the PCs hosts a set of MYOB (.myo) files, which are read and written through a data source. Operations on the .myo files are fine on the PC that hosts them, but from the other two machines access is painfully slow and unreliable, and I am wondering what I can do to speed it up. Some ideas I have:

    1. Turn off the Windows firewall on all the Windows installations on the home network.
    2. Buy another router, specifically one with a USB port, so I can put the .myo files on a USB flash drive plugged into the router and have all the PCs access them from there (does this speed things up?).

    Any advice on how to speed up access to this data is greatly appreciated.


  • Screenflow file type convert to AVI?

    - by Dave
    I've got a couple of large files, 2-3 GB each, from a training course in which the instructor used ScreenFlow on a Mac to record his screen and keypresses. I'm currently on a PC. Problem: how do I convert from .screenflow (and the associated .scc files) to AVI or something a PC can play? Problem 2: if I borrow a Mac, can I download http://www.telestream.net/screen-flow/overview.htm (which I think was the package used) and convert the files?


  • How to configure a Linux kernel based on the modules currently in use?

    - by Carla
    Hello, I want to build a minimal kernel with only the things my machine needs, so I started by compiling the kernel from the ground up, using the default configuration and adding things I know for sure I have (e.g. Ethernet card, WiFi card). But several other things are not so easy to know about (e.g. the watchdog timer), so I came across AutoKernConf, which supposedly detects the machine's hardware and generates a kernel configuration file with settings for the devices it found. The problem is that its output contained several repeated settings and even some for hardware I don't have (I'm using a Dell laptop, and one of the things it "found" was a Toshiba device). So I ended up building a kernel from the configuration produced by the make allmodconfig command, which compiles most things as modules. Booting into that kernel and running lsmod shows all the kernel modules in use (the ones really needed), and I would like to know if there is a tool or some way to parse that list and convert it into the corresponding kernel configuration file, or how to map each module to the appropriate kernel options so that I can set them manually. Thank you very much for your time.
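
    A minimal sketch of one approach, assuming a reasonably recent kernel tree: the localmodconfig make target (in mainline since 2.6.32) reads an lsmod listing and pares the configuration down to the modules it names.

        # On the running machine, snapshot the modules actually in use
        lsmod > /tmp/my-lsmod

        # In the kernel source tree: keep only what those modules need;
        # options for hardware not in the snapshot are dropped
        make LSMOD=/tmp/my-lsmod localmodconfig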


  • Read floppy from OpenVMS machine

    - by Goyuix
    I have a floppy I need to read the contents from - unfortunately it was formatted and the data written on an OpenVMS server. I believe the floppy is formatted "Files-11" and I can see parts of the MFT [equivalent] and file contents through a hex editor, however I would love to be able to mount this and actually read the files off. Is there a Files-11 FUSE module or other kernel module I can install to read this format? Any standalone utilities that can understand a floppy image taken with dd?
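
    Not the Files-11 part, but for the imaging step mentioned at the end, a sketch assuming the drive shows up as /dev/fd0: working from a dd image keeps the fragile original floppy out of harm's way.

        # Read the whole floppy once, padding unreadable sectors with zeros
        dd if=/dev/fd0 of=vms-floppy.img bs=512 conv=noerror,sync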


  • Windows: How to add batch-script action to Right Click menu

    - by ervingsb
    I have a few programs that create temp files, backup files, or similar unimportant files. For example, GVim for Windows by default creates a backup file named filename.txt~. I sometimes need to clean up a directory and remove all these files, and I have made a simple .bat file for this. However, it is cumbersome to have to start cmd, navigate to the folder, and run the script, especially since I would like to run it often on various folders, and I do not want to copy the script into multiple folders, as that would be a maintenance nightmare. So I was thinking the best solution would be to add a right-click menu item that runs the script, letting me right-click a folder in Explorer, click Cleanup, and have my script run on that folder. My question is: how do I add a right-click menu action that runs a custom batch script?
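
    A minimal sketch, assuming the script lives at C:\scripts\cleanup.bat (a placeholder path): importing a .reg file along these lines adds a Cleanup entry to the folder context menu, with Explorer passing the right-clicked folder as %1.

        Windows Registry Editor Version 5.00

        [HKEY_CLASSES_ROOT\Directory\shell\Cleanup]
        @="Cleanup"

        [HKEY_CLASSES_ROOT\Directory\shell\Cleanup\command]
        @="cmd.exe /c \"C:\\scripts\\cleanup.bat\" \"%1\""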


  • Should this folder called Data be indexed?

    - by panny
    In the indexing options of Windows 7 there is a folder called Data which is excluded from indexing on the C:\ drive by default. Can someone confirm this, please? I was not able to locate that folder on my drive, nor include it in the search index. The difference in the number of indexed files is unsatisfying: the native Windows 7 indexing service reports 377,703 files on six drives; a third-party desktop search indexing service reports 698,654 files on the same drives. Files under UAC control seem not to be indexed without the proper privileges. How can this be circumvented?


  • command line find/replace help

    - by Chrisbloom7
    I've got a set of 5000+ files in which I need to do a simple search and replace. I have been doing it in a text editor (EditPlus) by opening 500 files at a time, doing a global search/replace, saving all, closing, and so on. But that is taking literally hours, it's boring and tedious, and I have already done it once today and need to do it again because all the files got refreshed. Is there a way to do this from the Bash command line? Here are the details. Find:

        onchange="document.location ='/products/view.html/view/'+this.value"

    Replace it with:

        onchange="alert('Not implemented')" style="display: none"

    All of the files have a .HTM extension, but they are nested in several subdirectories.
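
    A minimal sketch with GNU sed (the -i flag as written is GNU-specific; BSD/macOS sed wants -i ''), using | as the sed delimiter because the pattern is full of slashes; test it on a copy of the tree first:

        find . -type f -iname '*.htm' -print0 | xargs -0 sed -i \
          "s|onchange=\"document.location ='/products/view.html/view/'+this.value\"|onchange=\"alert('Not implemented')\" style=\"display: none\"|g"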


  • Windows Server 2008 R2 DFS Root Namespace Required?

    - by caleban
    I would prefer to set up our DFS namespaces as:

        \\domain.local\users
        \\domain.local\customers
        \\domain.local\support

    etc. Is this a problem? Or do I need to instead set all of the above folders as targets under a single root, such as:

        \\domain.local\files\users
        \\domain.local\files\customers
        \\domain.local\files\support

    Other than the path being shorter in the first layout, which is what I would prefer, is there a difference in functionality in Windows DFS between the two? Thanks in advance.


  • rpmbuild gives seg fault

    - by Deepti Jain
    I am trying to build an RPM using the rpmbuild tool, from source code whose build output is around 30 GB. The software I am packaging has dozens of executables. When I copy only the binaries of a single executable (e.g. init) into the package, the RPM builds successfully, but when I dump the entire build into it, rpmbuild does everything and then segfaults at the end. Here is my spec file:

        # This is a sample spec file for wget
        %define _topdir /root/mywget
        %define name source
        %define release 1
        %define version 1.12
        %define _builddir /root/mywget/BUILD/glenlivet
        %define _buildrootdir /root/mywget/BUILDROOT
        %define _buildroot /root/mywget/BUILDROOT
        %define _sourcedir /root/mywget/SOURCES

        BuildRoot: %{_buildroot}
        Summary: GNU source
        License: GPL
        Name: %{name}
        Version: %{version}
        Release: %{release}
        Source: %{name}-%{version}.tar.gz
        Prefix: /usr
        Group: Development/Tools

        %description
        The GNU sample program downloads files from the Internet using the command-line.

        %prep
        %setup -q -n glenlivet

        %build
        cd %{_builddir}
        make all

        %install
        rm -rf %{_buildrootdir}
        mkdir -p %{_buildrootdir}/bin
        cp -p -r %{_builddir}/build/obj-x64/* %{_buildrootdir}/bin/

        %files
        %defattr(-,root,root)
        /bin/*

    If I copy only some of the binaries (say, one utility and its dependent binaries) it works fine, but when I try to copy the entire build, I get a segfault. The segfault comes after rpmbuild has executed the %prep, %build, and %install sections and processed my source file:

        Processing files: source-1.12-1
        Finding Provides:
        Finding Requires:
        Finding Supplements:
        Provides:......
        Requires:......
        Checking for unpackaged file(s): /usr/lib/rpm/check-files /root/mywget/BUILDROOT
        Checking for unpackaged file(s): /usr/lib/rpm/check-files /root/mywget/BUILDROOT
        Segmentation fault

    Any clue what is going wrong, or where rpmbuild fails? Thanks in advance


  • Synchronising a remote folder with a local one.

    - by Workshop Alex
    I am using a network disk (connected to my router by USB) to store several data files. A simple .NET application that I've created is supposed to read and modify these data files, but some security restrictions prevent the application from accessing them directly. (Actually, these restrictions are built into my application on purpose, since it's not meant to support NAS disks.) Since the disk is shared with several computers, I just want a simple synchronisation method that copies the files to a local folder where my application can access them and, once they are modified, sends the modified files back to the NAS disk. I have two options:

    1. Build a second application to do my own synchronisation.
    2. Find some built-in function of Windows 7 Ultimate which can do this for me.

    Option 2 is preferred. Option 1 is something I can do easily if need be. I don't need third-party tools. (Still, feel free to add references to good tools, although I won't accept them as answers.) Basically, is this possible with Windows 7, and if so, how?


  • Audio splitting and noise removal on Windows

    - by pts
    My mother has about 100 hours of audio in a mix of MP3 and WAV files, the digitized versions of her vinyl records. Each file contains about 5 songs with a few seconds of (noisy) pause between them. My mother needs software for Windows XP with which she can listen to the files, find the gaps manually, split the files at the gaps found, reduce the noise in each song, and export the songs as individual MP3 files. My mother has very limited software skills and affinity, and she doesn't speak English. The simpler the software, the better for her, even if its noise reduction is worse than that of a more sophisticated but more complicated program. I'd prefer free software, freeware, or shareware that can do all of the above. Please recommend something much simpler than Audacity. The software should guide the user through the process, always showing the next few available steps, and be intuitive in the sense that there are only a few allowed actions and it's obvious what they are and how to activate them. Which software would you recommend?
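
    For reference (a command line is surely too technical for the intended user, but it shows the gap-splitting step can be automated), a sketch using SoX; the silence thresholds are guesses that would need tuning for vinyl noise:

        # Split side-a.wav at every pause of 2+ seconds below 1% volume;
        # writes song001.wav, song002.wav, ... one file per detected track
        sox side-a.wav song.wav silence 1 0.5 1% 1 2.0 1% : newfile : restart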


  • Reduce "Metafile" memory usage?

    - by Jay Conrod
    My work computer (Windows 7 64-bit) spends a lot of time swapping memory when I switch between programs. This surprises me since I have 4 GB of RAM and the programs I use aren't particularly RAM-hungry (Outlook, Emacs, p4win, Firefox, various build tools). I downloaded RAMMap, and it shows over a gigabyte of memory used by "Metafile". From the Sysinternals blog:

        Metafile is part of the system cache and consists of NTFS metadata.
        NTFS metadata includes the MFT as well as the other various NTFS
        metadata files. ... In the MFT each file attribute record takes 1k
        and each file has at least one attribute record. Add to this the
        other NTFS metadata files and you can see why the Metafile category
        can grow quite large on servers with lots of files.

    So I understand what the "Metafile" data is... I work on large builds comprising hundreds of thousands of files (none are that big, but they add up to several gigabytes). My question is: how can I reduce the amount of memory used by "Metafile"? I'm not actively using all those files at once, so why does Windows need to keep their info in RAM? Restarting my machine every time I sync a new build is really annoying.


  • Ram cache on Windows Server 2008

    - by Jonas Lincoln
    Scenario: we have a file cluster on a UNC share, and a couple of IIS web servers serve files from that share. This is done through an IIS module, and the module does not use the built-in IIS caching feature. We'd like to cache the files from the UNC share in a RAM disk. So far we've found this product: http://www.superspeed.com/servers/supercache.php Are there other products that can help us cache the files from the UNC share in RAM?


  • Discrepancy in file size on disk and ls output

    - by smokinguns
    I have a script that checks for gzipped files larger than 1 MB and outputs the files along with their sizes as a report. This is the code:

        myReport=`ls -ltrh "$somePath" | egrep '\.gz$' | awk '{print $9,"=>",$5}'`

        # Count files that exceed 1MB
        oversizeFiles=`find "$somePath" -maxdepth 1 -size +1M -iname "*.gz" -print0 | xargs -0 ls -lh | wc -l`

        if [ $oversizeFiles -eq 0 ]; then
            status="PASS"
        else
            status="CHECK FAILED. FOUND FILES GREATER THAN 1MB"
        fi

        echo -e $status"\n"$myReport

    The problem is that the ls command reports the file sizes as 1.0MB in the report, yet the status is "CHECK FAILED" because the $oversizeFiles variable's value is 2. I checked the file sizes on disk and two files are 1.1MB. Why the discrepancy? How should I modify the script so that it generates an accurate report? BTW, I'm on a Mac. Here is what the man page for find says on Mac OS X:

        -size n[ckMGTP]
                True if the file's size, rounded up, in 512-byte blocks is n.
                If n is followed by a c, then the primary is true if the
                file's size is n bytes (characters). Similarly, if n is
                followed by a scale indicator, then the file's size is
                compared to n scaled as:
                        k       kilobytes (1024 bytes)
                        M       megabytes (1024 kilobytes)
                        G       gigabytes (1024 megabytes)
                        T       terabytes (1024 gigabytes)
                        P       petabytes (1024 terabytes)
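
    A minimal sketch of the likely fix: per that man page, -size +1M compares in whole megabytes rounded up, so a file just over 1 MB counts as 2 MB and trips the check even while ls -h rounds its display to 1.0M. Comparing in bytes with the c scale removes the rounding (adjust the threshold if "1MB" should mean 10^6 bytes):

        # Strictly larger than 1 MiB (1048576 bytes), no block rounding
        oversizeFiles=`find "$somePath" -maxdepth 1 -iname "*.gz" -size +1048576c | wc -l`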


  • Get tortoisesvn to give me filenames with build number in the filename

    - by EricJLN
    I am on a Windows 7 box and I have TortoiseSVN on my machine. After getting a little familiar with svn and TortoiseSVN on a code repository, I set up a local repository to manage revisions of some Word and PowerPoint documents. I want to figure out some scripted way to output a set of files with the build/revision number embedded in each filename; I will then email the files to some business people to review. For example, say I have a group of files in my working directory:

        PresentA.pptx
        PresentA-notes.docx
        PresentB.pptx

    and the TortoiseSVN repo browser tells me that I am currently at revision 21 for PresentA.pptx and PresentA-notes.docx but at revision 25 for PresentB.pptx. I would like some way to get three files with the following names:

        PresentA-r21.pptx
        PresentA-notes-r21.docx
        PresentB-r25.pptx

    Alternatively, if revision 25 is the current value for the whole repository, having all the names appended with -r25 would work too.
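
    A sketch, on the assumptions that a Subversion command-line client is installed alongside TortoiseSVN (its installer can add one; --show-item needs svn 1.9+) and that something like Git Bash or Cygwin provides the shell:

        #!/bin/bash
        # Copy each file to name-r<rev>.ext, where <rev> is the file's
        # last-changed revision in the working copy
        for f in PresentA.pptx PresentA-notes.docx PresentB.pptx; do
            rev=$(svn info --show-item last-changed-revision "$f")
            cp "$f" "${f%.*}-r${rev}.${f##*.}"
        done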


  • How do ulimit -n and /proc/sys/fs/file-max differ?

    - by bantic
    I notice that on a new CentOS image that I just booted up off of EC2 that the ulimit default is 1024 open files, but /proc/sys/fs/file-max is set at 761,408 and I'm wondering how these two limits work together. I'm guessing that ulimit -n is a per-user limit of number of file descriptors while /proc/sys/fs/file-max is system-wide? If that's the case, say I've logged in twice as the same user -- does each logged-in user have a 1024 limit on number of open files, or is it a limit of 1024 combined open files between each of those logged-in users? And is there much performance impact to setting your max file descriptors to a very high number, if your system isn't ever opening very many files?
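
    For what it's worth: ulimit -n is a per-process limit that each new shell inherits, so two logins each get their own 1024; it is not pooled between them. A quick sketch of inspecting and raising the two knobs on Linux (numbers are illustrative):

        # System-wide ceiling on open file handles across all processes
        cat /proc/sys/fs/file-max
        # Handles currently allocated / free / ceiling
        cat /proc/sys/fs/file-nr
        # Per-process soft limit for this shell and its children
        ulimit -n
        # Raise the soft limit for this shell, up to the hard limit (ulimit -Hn)
        ulimit -n 4096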


  • I need a few minutes of dedicated server a week, but not for hosting, just to convert ogg etc

    - by talkingnews
    I'm completely happy with my web hosting; it's just that I need to do one little thing they won't allow, and that's run an instance of SoX to convert about 30 MP3s to Ogg files, in various directories, a couple of times a week, done automatically in response to the detection of an uploaded MP3. It's probably a minute of server time over the whole week. I've had unhelpful suggestions on other forums like "why not leave your home PC on 24 hours a day and use all your ISP bandwidth to do this", which doesn't work for me. I know that I can host files on, say, Amazon S3, but is there something similar for my needs? All it would need to do is: wget/FTP the MP3 files, convert them to Ogg, and FTP the files back to my hosting. Of course, none of this would be needed if there were such a thing as a compiled binary of SoX (or any MP3-to-Ogg converter) for CentOS which I could upload without needing root access, but I've given up asking for that one (though I'm always open to suggestions!).
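
    If a box with SoX does turn up, the conversion loop itself is tiny. A sketch, assuming a SoX build with MP3 support and ~/uploads as a placeholder for the fetched tree:

        #!/bin/bash
        shopt -s globstar nullglob
        for f in ~/uploads/**/*.mp3; do
            ogg="${f%.mp3}.ogg"
            # Convert only files without an .ogg counterpart already
            [ -e "$ogg" ] || sox "$f" "$ogg"
        done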


  • How to get data from a borked Windows Home Server

    - by harhoo
    Yesterday we had a power surge followed by a power outage, which left my WHS borked: powering on just gives a flashing blue light (the LED on the power supply also flashes green), with no fan or boot activity, nothing. I urgently needed some files off it in the short term (and the 500 GB of photos, music, personal video, etc. in the long term), so I took the hard drive out and put it in my computer. The files and folders showed up, but I couldn't access them: clicking on an image gave an invalid-image error in Picasa, I couldn't play MP3s, and so on. I changed the ownership and permissions of the files; still nothing. I booted with a LiveCD, and the same: files appear but won't open. Is there anything else I can do? I'm now wondering if it was just the power cable that broke, but if so, why can't I access my files from the hard drive? If it is the power cable, and I replace that and put the hard drive back, will I have done any harm by messing around with ownership and file permissions?


  • Is there a way to automate changing filenames in <link>, <script> tags?

    - by nepsdotin
    When we use an Expires header for text files like JS and CSS, the contents are cached in the browser, so to make clients fetch new content we need to change the filenames referenced in the <link> and <script> tags of the HTML files whenever we make changes. How can we automate this? I may have a bunch of HTML files in multiple folders, including subdirectories. There would be a text file, filelist.txt:

        OldName             NewName
        oldfile1-ver-1.0.js oldfile1-ver-2.0.js
        oldfile2-ver-1.0.js oldfile2-ver-2.0.js
        oldfile3-ver-1.0.js oldfile3-ver-2.0.js
        oldfile4-ver-1.0.js oldfile4-ver-2.0.js

    The script should change every oldfile1-ver-1.0.js into oldfile1-ver-2.0.js in the HTML and PHP files. I would run this script before I start uploading. Finally, the script could create a list of the files and line numbers where it made an update. The solution can be in Perl/PHP/batch or anything that's nice and elegant.
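
    A bash sketch (GNU sed and grep assumed; the dots in the filenames are treated as regex wildcards, which is harmless for names this distinctive); run it on a copy of the tree first:

        #!/bin/bash
        # For each OldName/NewName pair, log where OldName occurs, then
        # replace it in every .html and .php file under the current tree
        while read -r old new; do
            [ "$old" = "OldName" ] && continue   # skip the header row
            [ -z "$old" ] && continue            # skip blank lines
            grep -rn --include='*.html' --include='*.php' "$old" . >> update-report.txt
            find . -type f \( -name '*.html' -o -name '*.php' \) \
                -exec sed -i "s|$old|$new|g" {} +
        done < filelist.txt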


  • backing up ntfs disk using rsync on ubuntu

    - by user70366
    For a long time I was using Windows. I have a separate drive I use to keep copies of my media files, photos, etc., which I periodically back up to an external drive; in Windows I used SyncToy to do this. After my Windows install stopped booting, I decided to switch to Linux (Ubuntu 10.10). That seems to be going fine, but now I want to back up my drive to the external drive like before. Mostly the two drives will already be the same, with maybe about 10 GB of extra files added. So I try to use rsync to synchronise the two drives like this:

        rsync --dry-run -rvlt --modify-window=1 /media/Antonio1TB/Backup /media/FREECOM\ HDD/Backup

    The problem is that the dry run indicates that every file on the drive will be copied, not just the files I have recently added. What is the correct command to sync two NTFS drives under Ubuntu so that files which already exist don't get copied again? Thanks.
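
    Two things worth checking, sketched below. First, without a trailing slash on the source path, rsync copies the Backup directory itself into the destination (creating Backup/Backup), which makes every file look new. Second, NTFS timestamps seen through ntfs-3g can sit exactly one hour off across DST changes, so a wider --modify-window is a common workaround:

        # Trailing slash on the source: sync the contents, not the directory
        rsync --dry-run -rvlt --modify-window=1 /media/Antonio1TB/Backup/ "/media/FREECOM HDD/Backup/"

        # If files still show as changed, tolerate a one-hour timestamp skew
        rsync --dry-run -rvlt --modify-window=3601 /media/Antonio1TB/Backup/ "/media/FREECOM HDD/Backup/"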


  • Got Hacked. Want to understand how.

    - by gaoshan88
    Someone has, for the second time, appended a chunk of JavaScript to a site I help run. This JavaScript hijacks Google AdSense, inserting their own account number and sticking ads all over. The code is always appended, always in one specific directory (one used by a third-party ad program), affects a number of files in a number of directories inside this one ad dir (20 or so), and is inserted at roughly the same overnight time. The AdSense account belongs to a Chinese website (located in a town not an hour from where I will be in China next month. Maybe I should go bust heads... kidding, sort of). By the way, here is the info on the site: http://serversiders.com/fhr.com.cn So, how could they append text to these files? Is it related to the permissions set on the files (ranging from 755 to 644)? To the webserver user (it's on MediaTemple, so it should be secure, yes?)? I mean, if you have a file that has permissions set to 777 I still can't just add code to it at will... how might they be doing this? Here is a sample of the actual code for your viewing pleasure (and as you can see, not much to it; the real trick is how they got it in there):

        <script type="text/javascript"><!--
        google_ad_client = "pub-5465156513898836";
        /* 728x90_as */
        google_ad_slot = "4840387765";
        google_ad_width = 728;
        google_ad_height = 90;
        //-->
        </script>
        <script type="text/javascript"
            src="http://pagead2.googlesyndication.com/pagead/show_ads.js">
        </script>

    Since a number of folks have mentioned it, here is what I have checked (and by "checked" I mean I looked for any weirdness around the time the files were modified, and grepped the files for POST statements and directory traversals):

        access_log (nothing around that time except the usual (i.e. excessive) msnbot traffic)
        error_log (nothing but the usual file-does-not-exist errors for innocuous-looking files)
        ssl_log (nothing but the usual)
        messages_log (no FTP access in here except for me)
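
    One hedged way to narrow it down, assuming GNU find on the server and ./ad-dir as a placeholder for the affected directory: a timestamp sweep listing recently changed files newest first, which can then be matched against the raw logs second by second.

        # Epoch-stamped list of files modified in the last 2 days,
        # newest first, to pin down the exact injection time
        find ./ad-dir -type f -mtime -2 -printf '%T@ %p\n' | sort -rn | head -20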


  • Windows 7: L10N mechanics

    - by John Sonderson
    I have a localized version of Windows 7, and I can't figure out where Windows gets the names for files and directories on the system. For instance, consider the following (default) files:

        > cd C:\Users\Public\Pictures\Sample Pictures
        > dir
        Chrysanthemum.jpg
        Desert.jpg
        ...

    When I view these files in the default file explorer I see these names:

        Crisantemo.jpg
        Deserto.jpg
        ...

    This seems to imply that each file can somehow be assigned a localized name somewhere, but I cannot figure out how. I would appreciate it if someone could shed some light on this issue. Thanks.
    UPDATE EDIT: The desktop.ini file in the folder containing Chrysanthemum.jpg contains the entries below. The .dll files used to translate the various resources are unfortunately not human-readable, and I have no clue how they could be generated for other, user-created files, but they serve the purpose and solve the mystery that led to the post. Thanks.

        [LocalizedFileNames]
        Chrysanthemum.jpg=@%systemroot%\system32\SampleRes.dll,-101
        Desert.jpg=@%systemroot%\system32\SampleRes.dll,-102
        Hydrangeas.jpg=@%systemroot%\system32\SampleRes.dll,-103
        Jellyfish.jpg=@%systemroot%\system32\SampleRes.dll,-104
        Koala.jpg=@%systemroot%\system32\SampleRes.dll,-105
        Tulips.jpg=@%systemroot%\system32\SampleRes.dll,-106
        Lighthouse.jpg=@%systemroot%\system32\SampleRes.dll,-107
        Penguins.jpg=@%systemroot%\system32\SampleRes.dll,-108

        [.ShellClassInfo]
        LocalizedResourceName=@%SystemRoot%\system32\shell32.dll,-21805


  • Why is MySQL unable to open hosts.allow/hosts.deny?

    - by HonoredMule
    I have a storage server running Nexenta (OpenSolaris kernel, Ubuntu userspace) with MySQL on top of a ZFS storage array, using innodb_file_per_table and ulimit -n set to 8K. mysqltuner.pl confirms the file limit and claims there are 169 files. The following command:

        pfiles `fuser -c / 2>/dev/null`

    indicates one mysqld process having 485 file/device descriptors (almost all of them for files), so I don't know how reliable the tuning script is, but that is still way below 8K, and this listing also finds no other process close to its limit. The global total number of descriptors in use is around 1K. So what can cause mysqld to constantly stream the following errors?

        [date] [host] mysqld[pid]: warning: cannot open /etc/hosts.allow: Too many open files
        [date] [host] mysqld[pid]: warning: cannot open /etc/hosts.deny: Too many open files

    Everything appears to actually be operating fine, but the issue constantly floods the admin console and starts right away on a fresh boot (not only reproducible, but always from mysqld and always the hosts files, whose permissions are the default -rw-r--r-- 1 root root). I could, of course, suppress it from the admin console, but I'd rather get to the bottom of it while still allowing mysqld warnings/errors to reach the admin console. EDIT: Not only is the actual file descriptor count well within sane limits, the issue also persists (appearing immediately) even with the file limit raised to 65535, and always only for hosts.allow/deny.
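
    One hedged check, on the assumption that Nexenta ships the Solaris proc tools: the ulimit set in a shell only applies to processes started from that shell, so it is worth confirming which limits the running mysqld actually inherited.

        # Show the resource limits of the running mysqld process
        plimit `pgrep -x mysqld`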

