Search Results

Search found 69973 results on 2799 pages for 'file comparison'.


  • Ruby: Is there a better way to iterate over multiple (big) files?

    - by zxcvbnm
    Here's what I'm doing (sorry for the variable names, I'm not using those in my code):

        File.open("out_file_1.txt", "w") do |out_1|
          File.open("out_file_2.txt", "w") do |out_2|
            File.open_and_process("in_file_1.txt", "r") do |in_1|
              File.open_and_process("in_file_2.txt", "r") do |in_2|
                while line_1 = in_1.gets do
                  line_2 = in_2.gets
                  # input files have the same number of lines
                  # process data and output to files
                end
              end
            end
          end
        end

    The open_and_process method is just to open the file and close it once it's done. It's taken from the pickaxe book. Anyway, the main problem is that the code is nested too deeply. I can't load all the files' contents into memory, so I have to iterate line by line. Is there a better way to do this? Or at least prettify it?
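
    Not from the original post, but a minimal sketch of one way to flatten the nesting: open every handle up front, drive a single loop, and close everything in an ensure block. The file names mirror the question; the processing step stays a comment.

        ins  = ["in_file_1.txt", "in_file_2.txt"].map { |name| File.open(name, "r") }
        outs = ["out_file_1.txt", "out_file_2.txt"].map { |name| File.open(name, "w") }
        begin
          while (line_1 = ins[0].gets)
            line_2 = ins[1].gets  # input files have the same number of lines
            # process line_1 and line_2, write results to outs[0] / outs[1]
          end
        ensure
          (ins + outs).each(&:close)  # close every handle even if processing raises
        end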

    Read the article

  • Untrusted file not showing unblock button windows 7

    - by Stewart Griffin
    I downloaded a DLL but cannot use it, as it is considered untrusted. I opened it using:

        Notepad.exe filepath\filename:zone.identifier

    and it informed me that the file was in zone 3. Despite this I do not get an Unblock button in the properties page for the file. Since I could not unblock it with that button, I instead changed the value in Notepad and saved my changes. When I reopen the zone.identifier info it is as I left it. I have set it to both 2 (trusted) and 0 (no information), but I am still unable to use the files. Anyone have any ideas? If I cannot unblock the files I will investigate turning this blocking off, but as a first step I'd like to try to unblock just this one file. Note: I am using Windows 7 Ultimate edition, and it is when running MSTest from within Visual Studio 2008 that I hit problems.
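
    For what it's worth, two hedged command-line ways to drop the Zone.Identifier stream entirely (neither is mentioned in the post, and the DLL path below is a placeholder): Sysinternals streams.exe can delete alternate data streams, and PowerShell 3.0 or later ships an Unblock-File cmdlet.

        # delete all alternate data streams with Sysinternals streams.exe
        streams.exe -d C:\libs\mylibrary.dll

        # or, with PowerShell 3.0 or later
        Unblock-File -Path C:\libs\mylibrary.dll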

    Read the article

  • How to Configure Windows Machine to Allow File Sharing with DNS Alias

    - by Michael Ferrante
    I have not seen a single article posted anywhere online that brings together all the settings needed to make this work properly on Windows, so I thought I would post it here.

    To facilitate failover schemes, a common technique is to use DNS CNAME records (DNS aliases) for different machine roles. Then, instead of changing the Windows computer name of the actual machine, one can switch a DNS record to point to a new host. This can work on Microsoft Windows machines, but to make it work with file sharing the following configuration steps need to be taken.

    Outline:
      1. The Problem
      2. The Solution
         - Allowing other machines to use file sharing via the DNS alias (DisableStrictNameChecking)
         - Allowing the server machine to use file sharing with itself via the DNS alias (BackConnectionHostNames)
         - Providing browse capabilities for multiple NetBIOS names (OptionalNames)
         - Registering the Kerberos service principal names (SPNs) for other Windows functions like printing (setspn)
      3. References

    1. The Problem

    On Windows machines, file sharing works via the computer name, with or without full qualification, or by IP address. By default, however, file sharing will not work with arbitrary DNS aliases. To enable file sharing and other Windows services to work with DNS aliases, you must make the registry changes detailed below and reboot the machine.

    2. The Solution

    Allowing other machines to use file sharing via the DNS alias (DisableStrictNameChecking): this change alone will allow other machines on the network to connect to the machine using any arbitrary hostname. (It will not allow a machine to connect to itself via a hostname; see BackConnectionHostNames below.) Edit the registry key HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\lanmanserver\parameters and add a value DisableStrictNameChecking of type DWORD set to 1.

    Allowing the server machine to use file sharing with itself via the DNS alias (BackConnectionHostNames): this change is necessary for a DNS alias to work with file sharing from the machine to itself. It creates the Local Security Authority host names that can be referenced in an NTLM authentication request. To do this, follow these steps on the computer: to the registry subkey HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Lsa\MSV1_0, add a new Multi-String Value named BackConnectionHostNames. In the Value data box, type the CNAME (DNS alias) that is used for the local shares on the computer, and then click OK. Note: type each host name on a separate line.

    Providing browse capabilities for multiple NetBIOS names (OptionalNames): this allows the network alias to appear in the network browse list. Edit the registry key HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\lanmanserver\parameters and add a value OptionalNames of type Multi-String. Add a newline-delimited list of names that should be registered under the NetBIOS browse entries. Names should match NetBIOS conventions (i.e. not the FQDN, just the hostname).

    Registering the Kerberos service principal names (SPNs) for other Windows functions like printing (setspn): you should not need to do this for basic functions to work; it is documented here for completeness. We had one situation in which the DNS alias was not working because an old SPN record was interfering, so if the other steps aren't working, check whether there are any stray SPN records. You must register the Kerberos service principal names (SPNs), the host name, and the fully qualified domain name (FQDN) for all the new DNS alias (CNAME) records. If you do not, a Kerberos ticket request for a DNS alias (CNAME) record may fail and return the error code KDC_ERR_S_SPRINCIPAL_UNKNOWN. To view the Kerberos SPNs for the new DNS alias records, use the Setspn command-line tool (setspn.exe). The Setspn tool is included in Windows Server 2003 Support Tools, which you can install from the Support\Tools folder of the Windows Server 2003 startup disk. To list all records for a computer name: setspn -L computername. To register the SPN for the DNS alias (CNAME) records, use the Setspn tool with the following syntax: setspn -A host/your_ALIAS_name computername and setspn -A host/your_ALIAS_name.company.com computername.

    3. References

    All the Microsoft references are available via http://support.microsoft.com/kb/
      - KB281308: Connecting to SMB share on a Windows 2000-based computer or a Windows Server 2003-based computer may not work with an alias name. Covers the basics of making file sharing work properly with DNS alias records from other computers to the server computer.
      - KB926642: Error message when you try to access a server locally by using its FQDN or its CNAME alias after you install Windows Server 2003 Service Pack 1: "Access denied" or "No network provider accepted the given network path". Covers how to make the DNS alias work with file sharing from the file server itself.
      - KB870911: How to consolidate print servers by using DNS alias (CNAME) records in Windows Server 2003 and in Windows 2000 Server. Covers more complex scenarios in which records in Active Directory may need to be updated for certain services to work properly and for browsing for such services to work properly, and how to register the Kerberos service principal names (SPNs).
      - KB829885: Distributed File System update to support consolidation roots in Windows Server 2003. Covers even more complex scenarios with DFS (discusses OptionalNames).
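
    The same registry changes, collected as a command-line sketch; the alias fileserver-alias and the computer name FILESRV01 are placeholders, and a reboot is still required afterwards.

        REM allow other machines to connect to shares via the alias
        reg add "HKLM\SYSTEM\CurrentControlSet\Services\lanmanserver\parameters" /v DisableStrictNameChecking /t REG_DWORD /d 1

        REM allow the server to reach its own shares via the alias (NTLM loopback)
        reg add "HKLM\SYSTEM\CurrentControlSet\Control\Lsa\MSV1_0" /v BackConnectionHostNames /t REG_MULTI_SZ /d fileserver-alias

        REM show the alias in the network browse list
        reg add "HKLM\SYSTEM\CurrentControlSet\Services\lanmanserver\parameters" /v OptionalNames /t REG_MULTI_SZ /d fileserver-alias

        REM register SPNs for the alias (only needed for Kerberos-dependent services)
        setspn -A host/fileserver-alias FILESRV01
        setspn -A host/fileserver-alias.company.com FILESRV01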

    Read the article

  • sudoers file cleanup and consolidation tool/script

    - by Prashanth Sundaram
    Hello All, I am curious to know what other folks out there might be using to keep the sudoers file in a sane state. I am looking for a tool that removes redundant entries and overlapping permissions, and/or presents the sudoers file in an organized way (for example, sorted by permissions/users/aliases). I use SVN for version control and a config management tool for deployment. Is there any add-on/plugin you would recommend or use?

        User_Alias RT1123 jappleseed, sjobs
        Host_Alias HOST_RT1123 wdc101.domain.com, wdc104.domain.com
        Cmnd_Alias .....

    Our sudoers file is simple but has a lot of entries, and it needs to be cleaned up. Does anyone know of or have a tool/script to fix or present it? Thanks!

    Read the article

  • Can't write to file - 'Operation not permitted' WITH sudo

    - by charliehorse55
    I am having trouble writing to a few files on an external HD. I am using it to store media files as well as my Time Machine backup. The drive is formatted as HFS+ Journaled, and other files on the drive can be written successfully. Additionally, the Time Machine backup is working perfectly. Permissions for the file:

        $ ls -le -@ Parks\ and\ Recreation\ -\ S01E01.avi
        -rw-rw-rw-@ 1 evantandersen  staff  182950496 22 May  2009 Parks and Recreation - S01E01.avi
                com.apple.FinderInfo    32

    Things I have already tried:
      - sudo chflags -N
      - sudo chown myusername
      - sudo chown 666
      - sudo chgrp staff
      - Checked that the file is not locked (Get Info in Finder)

    Why can't I modify that file? Even with sudo I can't modify it at all.
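
    A hedged checklist of commands, not from the original post, for ruling out ACLs, the locked flag, and extended attributes on macOS (run against the same file as above):

        # show ACL entries (+ suffix) and extended attributes (@ suffix)
        ls -le -@ "Parks and Recreation - S01E01.avi"

        # remove any ACL entries and clear the user-immutable (locked) flag
        sudo chmod -N "Parks and Recreation - S01E01.avi"
        sudo chflags nouchg "Parks and Recreation - S01E01.avi"

        # list extended attributes; FinderInfo alone should not block writes
        xattr -l "Parks and Recreation - S01E01.avi"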

    Read the article

  • cmd.exe version comparison?

    - by Paul
    When using batch files or console applications on Windows servers, the window in question allows text to be highlighted (marked) for copying and pasting. Doing this pauses the batch/application, and it only resumes after the copy operation. Or at least, that is what I thought to be true. Recently on a Windows 2003 R2 SP2 server I noted that while the scrolling was paused, the operations were not. Does anyone know whether my description in the first paragraph is true for older Windows versions but no longer true for Windows 2003 R2 SP2, when it changed, or whether there is a full version comparison table for cmd.exe across different OSes? Thanks for reading. (Windows 2000 tag as that was the OS I used most before 2003 R2.)

    Read the article

  • File association for editing on a mac

    - by Agos
    I'm quite experienced with how file association works for opening files on Mac OS X. I recall reading somewhere that OS X keeps not only the information about which apps can open a file, but also which apps can edit a specific file type. I'm having problems with those applications (Coda, Espresso, Forklift, Flow) that have an “edit with external editor” feature, since issuing this command on HTML files opens them with Dashcode. Dashcode of course is not the current association for opening these files (Safari is), so it's clearly looking for apps that can edit HTML. Since I'd like to use TextMate as my editor in these cases, how can I set this preference?

    Read the article

  • Box.com file sharing - How are you managing concurrent document access and file locks? [closed]

    - by Matt
    My company is evaluating Box.com as a file server replacement. Its file locking behavior for concurrent access to files seems incomplete. Specifically, files are not locked* (either exclusively or read-only) when they are being edited by Office or similar programs. This inevitably results in multiple versions of documents, as concurrent access leads to change conflicts. *The exception is when the file is edited using Zoho Docs - perhaps other web-based office suites as well. Box provides multiple options for editing documents, including Google Docs, a local copy of Office or similar, Zoho Docs and others. If you are using Box, how have you managed or worked around this behavior?

    Read the article

  • How to track down a file descriptor leak?

    - by cclark
    I have a Java process (Glassfish) which is leaking file descriptors. I know this because I get the helpful java.io.IOException: Too many open files exception. I can look in /proc/PID#/fd and see all the open file descriptors. When I use lsof I get a very large number of entries like this:

        java    18510  root  8811u  sock  0,4  1576079  can't identify protocol
        java    18510  root  8812u  sock  0,4  1576111  can't identify protocol
        java    18510  root  8813u  sock  0,4  1576150  can't identify protocol

    I see 12 new ones created per minute. What options can I use on lsof, or what other tools are available, to help track down socket file descriptors where the protocol can't be identified? Thanks, chuck
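
    A hedged sketch of how one might narrow this down, using the PID 18510 from the lsof output above: watch the raw descriptor count, isolate the unidentified sockets, and trace socket creation in the live process.

        # watch the descriptor count grow over time
        watch -n 60 'ls /proc/18510/fd | wc -l'

        # list only the unidentified sockets and note their inode numbers
        lsof -p 18510 | grep "can't identify protocol"

        # trace socket creation/teardown as it happens to spot the code path that never closes
        strace -f -e trace=socket,connect,close -p 18510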

    Read the article

  • Replace the broken file copying UI in Windows 2008 Server 64-bit Explorer

    - by cbp
    Does anyone know a good GUI alternative for file copying on Windows 2008 Server 64-bit edition? The built-in GUI has a hopeless interface and is bug-riddled, which really hinders the ability to get things done safely. For example, often when moving a directory with subfolders, the directory and its subfolders will still remain, empty and not deleted. I've been through many of the common file copiers and Windows Explorer alternatives, but either they flat-out do not work on a 64-bit/W2k8 machine or they do not actually fully replace the file copier.

    Read the article

  • Keyboard Navigation of File Open Dialog in Windows 7

    - by dkusleika
    In the Windows XP standard File - Open dialog, the top has a "Look In" box. I can press Alt+I to drop down a tree of the disk's folders and easily navigate to other folders or network shares. In Windows 7, I can't seem to navigate the File - Open dialog as easily. The best I've been able to muster is to tab 5 times (in Excel 2007, but I assume it's a Windows standard), then use arrow keys or Alt+arrow keys like a browser to get around. It's simply not as good because I can't see the whole tree at once. Is there a way to see the whole folder tree? If not, do you have any other tips for keyboard navigation of the File Open dialog in Windows 7?

    Read the article

  • MS Word reports files read-only on Win Server 2003 file server

    - by Larry Hamelin
    I'm not a sysadmin, but I play one on TV: I'm trying to fix a problem on my mom's tiny non-profit company's server. I set up a Windows Server 2003 machine as a domain controller and file server. Everything has been working well for a few months, but lately when she tries to save changes to a Word (Office XP) document stored on the server, Word will intermittently report that the file is read-only. Saving to an alternate file in the same directory works, and when she closes Word and re-opens the original document, it saves changes just fine. No one else ever has these files open. I've checked security and share permissions, and everything's OK. We've tried rebooting the server, but the problem continues, albeit intermittently. I have no clue what's going on. Help!

    Read the article

  • 403 Forbidden when trying to download file that was uploaded using SSH

    - by Simon Hartcher
    I have FTP access to an Apache server on Linux to upload files so that they can be downloaded from the web. I was recently granted SSH access for extra permissions and figured that it would be quicker to download the files directly onto the server, instead of downloading them to my machine and then FTPing them to the server. When I downloaded a file over SSH directly onto the server and then placed it in the public_html directory, it was not visible from the web. The permissions (from SSH and the FTP client) were the same as all the other files that are visible, but it was not shown in the directory listing, and if I typed the filename into my browser I would get a 403 error. Obviously, when I FTP a file to the server, something else happens that makes it web-visible, which I am not currently privy to. What am I missing that is causing the file to be invisible from the web?
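
    The post says the mode bits already match, so a hedged sketch of what else to compare (the file names and account name are placeholders): the permissions of every directory in the path, and the numeric owner/group of the SSH-downloaded file versus an FTP-uploaded one.

        # walk every component of the path and show its owner and permissions;
        # a directory the web server user cannot traverse also produces a 403
        namei -l /home/youraccount/public_html/invisible-file.zip

        # compare numeric uid/gid of a working file and the invisible one
        ls -ln public_html/visible-file.zip public_html/invisible-file.zip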

    Read the article

  • How to recover unsaved PSD file on MacOSX

    - by cenk
    Adobe Photoshop creates temporary *.psb files for emergency recovery at this path:

        ~/Library/Application Support/Adobe/Adobe Photoshop CS6/AutoRecover

    The files created have names like _Untitled-10FDB62ECBABBFF5C8EAD958EBC9CFAE2E.psb, with the current user:group as the designated owner. If you save the file you are working on OR you hit "don't save" when prompted, the temporary files are deleted. So the system creates and deletes these files. I am trying to recover the emergency file, but I think the "undelete" utilities were created assuming the "user" deletes the file - like going into the trash bin and then emptying the trash... Does anyone have experience with this? Thanks.

    Read the article

  • Old hard drive file permissions still there

    - by blsub6
    I have a new hard drive, put Windows 7 on it and want to get all the files off of my old hard drive. I put in my old hard drive as a slave drive. I can see the files but when I try to move 'em, it tells me that I'm not the owner of the file. I try to take ownership of the file and it doesn't work (it doesn't tell me that I can't take ownership of it, it goes through, just gives me the same error when I try and open the file again). I've tried modding the permissions, no dice. Anything else I can try?
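
    A hedged command-line sketch of the usual approach from an elevated prompt; the drive letter, folder, and account name are placeholders.

        REM take ownership of everything under the old profile folder, recursively
        takeown /F D:\Users\OldUser /R /D Y

        REM then grant your current account full control on the same tree
        icacls D:\Users\OldUser /grant YourUser:F /T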

    Read the article

  • Fixing mac user file permissions, not the system

    - by Cawas
    Usually those files get the wrong permissions when coming from the network, even when I copy them from it, but mostly through "file sharing". So, definitely not talking about Disk Utility repair here, please. Regardless of how the file got the wrong permissions, I know of two bad ways to fix them. One is CMD+I and the other is chown / chmod. The command line isn't all bad, but it isn't practical either. Sometimes it's just 1 file I need to repair, sometimes it's a bunch of them. By "repair" I mean 644 for files, 755 for folders, and current user:group for all of them. Isn't there any app / script / automator out there to do that?
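
    Not an existing app, just a minimal shell sketch of the "repair" described above; save it as a script and pass it a file or folder.

        #!/bin/sh
        # repair one path: 644 for files, 755 for directories,
        # current user and group for everything
        target="$1"
        sudo chown -R "$(id -un):$(id -gn)" "$target"
        sudo find "$target" -type f -exec chmod 644 {} +
        sudo find "$target" -type d -exec chmod 755 {} +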

    Read the article

  • File apparently doesn't exist when attempting to delete it

    - by Alex Yan
    A month or so back, I untarred the Linux source in a folder in Cygwin (I was curious as to whether or not it would compile with MinGW, 'cause my other computer running Linux is a slow single-core Sempron). I tried deleting it, but there's 1 file left, and it will not delete... Cygwin resides in C:\cygwin, and I untarred the source in C:\cygwin\src\linux-3.7.1. It didn't compile... So I tried deleting the folder. It was going well until the end, when I realized not all files were deleted. I tried deleting the linux-3.7.1 folder again, and an error popped up (screenshot omitted). I opened the folder and found that there's 1 source file left: aux.c, which is at C:\cygwin\src\linux-3.7.1\drivers\gpu\drm\nouveau\core\subdev\i2c\aux.c. It will not:
      - Delete
      - Open
      - Move
    (Screenshots of the file's General and Security properties pages omitted.) How do I remove this file?
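
    An aside not taken from the post: "aux" is a reserved DOS device name, which is the classic reason such a file resists normal deletion, and the \\?\ path prefix tells Windows to skip that name parsing. A hedged sketch from cmd.exe:

        REM the \\?\ prefix bypasses Win32 reserved-name handling for "aux"
        del "\\?\C:\cygwin\src\linux-3.7.1\drivers\gpu\drm\nouveau\core\subdev\i2c\aux.c"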

    Read the article

  • Recover open but deleted file on Linux using ln instead of cp

    - by Yang
    Say I have a file that's downloading (from a source that's hard to re-download from), but accidentally deleted from the filesystem namespace (/tmp/blah), and I'd like to recover this file. Normally I could just cp /proc/$PID/fd/$FD /tmp/blah, but in this case that would only get me a partial snapshot, since the file is still downloading. Furthermore, once the download completes, the downloading process (e.g. Chrome) will close the FD. Any way to recover by inode/create a hard link? Any other solutions? If it makes any difference, I'm mainly concerned with ext4. Thanks in advance.
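
    A hedged alternative sketch rather than the hard-link approach asked about: keep streaming from the still-open descriptor until the writer finishes, which avoids the partial-snapshot problem of a one-shot cp.

        # PID and FD are the downloading process and its descriptor, as in the question;
        # tail keeps reading as the file grows, so stop it once the download completes
        tail -c +1 -f /proc/$PID/fd/$FD > /tmp/blah.recovered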

    Read the article

  • how to run an AFS file server on a specific ethernet card (in Debian)

    - by listboss
    I have a Linux box running Debian server with a minimal number of packages (so no GUI for network management). The box has two ethernet cards, one of which (eth0) is connected to a Mac OS X computer using a cross-cable. I can bring up eth0 and assign a static IP (10.10.11.16) to it. This way I can ssh to the box through the cross-cable. This is what I run on the Linux box:

        ifconfig eth0 10.10.11.16 netmask 255.255.255.0 up

    I also installed/started a file server (AFS) on Debian. So far, the file server can only be accessed through eth1, which is exposed to my home LAN and the internet. My goal is to set up the file server so that it's only visible through eth0. Is this possible, and if yes, how can I do it?
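
    One hedged possibility (the path below is an assumption and varies by packaging): OpenAFS file servers can be restricted to particular addresses with NetInfo/NetRestrict files in the server's local state directory.

        # OpenAFS servers can be told which address(es) to use/advertise via a NetInfo
        # file (and which to exclude via NetRestrict). Classic Transarc layouts use
        # /usr/afs/local; Debian's packaging may differ, so confirm the directory first.
        echo "10.10.11.16" > /usr/afs/local/NetInfo
        # restart the AFS server processes afterwards so the change takes effect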

    Read the article

  • Why is my hosts file not working?

    - by elliot100
    I've been using the hosts file for local website development, and it's recently stopped working. No entries other than localhost resolve. I've simplified it to test, so it now contains only:

        127.0.0.1 localhost
        ::1 localhost
        127.0.0.1 test.dev

    localhost responds to ping, test.dev does not.
      - The file is called hosts with no extension
      - It has no trailing spaces
      - It's saved in C:\WINDOWS\System32\drivers\etc, which matches the value of HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters\DataBasePath
      - Oddly, despite UAC being on, I can edit, delete and save the file without admin permissions
      - No proxy is being used, and the PC is not connected to a network for testing
      - Stopping the DNS Client service seemed to resolve the issue for a few minutes; test.dev briefly resolved but doesn't any more
      - The only firewall is Windows'
      - The machine has been restarted
    Is there anything else I should try?
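
    Two hedged checks that aren't in the list above: a stale resolver cache, and a hosts file saved with an unexpected encoding.

        REM flush the resolver cache, then test name resolution directly
        ipconfig /flushdns
        ping -4 test.dev

        REM view the raw file to check for stray characters or an unexpected encoding
        type C:\WINDOWS\System32\drivers\etc\hosts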

    Read the article

  • How to download a url as a file?

    - by Michelle
    A website url has "hidden" some mp3 files by embedding them as shockwave files, as follows: <span class="caption"><!-- Odeo player --><embed src="http://odeo.com/flash/audio_player_tiny_gray.swf"quality="high" name="audio_player_tiny_gray" align="middle" allowScriptAccess="always" wmode="transparent" type="application/x-shockwave-flash" flashvars="valid_sample_rate=true external_url=http://podcast.cbc.ca/mp3/sundayeditionstream_20081125_9524.mp3" pluginspage="http://www.macromedia.com/go/getflashplayer"></embed></span> How can I download the files for off-line listening? I've found two methods: 1. The StackOverflow Method Create a new local html file with just the links eg <a href="http://podcast.cbc.ca/mp3/sundayeditionstream_20081125_9524.mp3">Sunday Edition 25Nov2008</a> Open the file in the browser, right click the link and File Save Link As. 2. The SuperUser Method Install the Firefox addin Iget. (Be sure to use the right version for your Firefox version.) Tools Downloads Enter url in field. Are there any other ways?

    Read the article

  • Linux - File was deleted and then reappeared when folder was zipped

    - by davee9
    Hello, I am using Backtrack 4 Final, which is a Linux distro that is Ubuntu based. I had a directory that contained around 5 files. I deleted one of the files, which sent it to the trash. I then zipped the directory up (now containing 4 files), using this command:

        zip -r directory.zip directory/

    When I then unzipped directory.zip, the file I deleted was in there again. I couldn't believe this, so I zipped up the directory again, and the file reappeared again, but this time it could not be opened because the operating system said it didn't exist or something. I don't remember the exact error, and I cannot make this happen again. Would anyone happen to know why a file that was deleted from a directory would reappear in that directory after it was zipped up? Thank you.

    Read the article

  • Opening NBF backup file?

    - by ellisgeek
    I have a backup file from before I reinstalled Windows, but I am unable to open it because it is an NBF file. It was created with Acer Backup Manager, which is a proprietary version of NTI's backup software. Is there any way to open this? I have tried using NTI Backup Now! 4.x, but it says the file is invalid. Acer Backup Manager will only let me restore the ENTIRE image (not what I want), and many hours of googling have left me empty-handed.

    Read the article

  • Batch edit (not rename) file properties in windows

    - by Jay
    I have a large directory of downloaded shareware. I keep track of what I have by individually editing the properties of each program. However, some of the programs are multipart .rar types, and I have at least a few hundred programs so far. I am looking for a utility that will let me batch edit file properties such as Title, Author, Summary, and Comments, so I don't have to edit each file or file part individually. Windows doesn't let me do this in Explorer. Powerdesk has a proprietary system, but it isn't preserved when moving or copying files. Any suggestions?

    Read the article

  • nginx static file buffer

    - by Philip
    I have an NFS share which several frontend servers are connected to, for making the files stored on it available for HTTP downloads. It looks like I have problems with the way Apache is serving the files; there seems to be a very small buffer, or no buffer at all, which results in a lot of disk seeks. I did some testing with loading the whole requested file into memory at once and serving it to the client from memory. With this technique I need fewer disk seeks per download stream. Since I don't want to implement this myself for production use, I thought that I could maybe use nginx for that, because the documentation says that it uses buffers for static file serving. Is it possible to increase the buffer size to a few MB, and if so, which config parameter do I have to change for this? Has anyone experience with large buffers for static file serving? Is there a better way to reduce disk seeks?
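
    A hedged nginx sketch along those lines; the location and buffer size are placeholders. With sendfile disabled, nginx reads static files through its own output buffers, whose number and size are tunable.

        location /downloads/ {
            sendfile        off;     # read through userspace buffers instead of sendfile()
            output_buffers  1 2m;    # one 2 MB buffer per connection (placeholder size)
        }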

    Read the article
