Search Results

Search found 32130 results on 1286 pages for 'local search'.

Page 453/1286 | < Previous Page | 449 450 451 452 453 454 455 456 457 458 459 460  | Next Page >

  • De-duplicate Firefox bookmarks

    - by Zoredache
    What methods exist to de-duplicate Firefox bookmarks? Searching Google, I find there was previously a plugin called CheckPlaces, but it no longer seems to exist. Another popular suggestion is AM-DeadLink, which I tried, but it completely trashed my bookmarks. (Fortunately I had a backup first, and yes, I had closed Firefox first as instructed.) I was trying to move all my youtube.com bookmarks into a folder, so I did a search and then dragged the results into the folder. Apparently this creates copies instead of moving the bookmarks as I expected, so now I have three of everything since I tried a couple of times.
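
    Not from the post itself, but a minimal sketch of one way to spot the duplicates, assuming the bookmarks have been exported to HTML first (via the Library's Import and Backup > Export Bookmarks to HTML; the menu path and the bookmarks.html file name are assumptions of mine): it simply counts how often each URL appears in the export.

        from collections import Counter
        import re

        # Parse the exported bookmarks.html and report URLs that appear more than once.
        with open("bookmarks.html", encoding="utf-8") as f:
            urls = re.findall(r'href="([^"]+)"', f.read(), re.IGNORECASE)

        for url, count in sorted(Counter(urls).items()):
            if count > 1:
                print(count, url)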

    Read the article

  • How to compare files/directories of 2 separate Solaris boxes?

    - by chz
    Hi friends, I have two Solaris boxes and I need to check certain directories (on the local filesystem and on mounted NFS) to make sure they match up on both boxes, and to delete or move any mismatches elsewhere on the local filesystem. I looked into Unix commands like rsync and tree, but it appears these commands are not available on my Solaris boxes. What is the least painful approach: use rsync or tree and diff the outputs, or use find? I have trouble limiting the find command to certain directories, because some mounted folders contain too many XML files that I don't care about. How do I search multiple directory paths with a single find command? Thanks, sincerely.
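
    A minimal sketch of the diff-the-outputs approach, assuming a reasonably recent Python is available on both boxes (that availability, the script name and the example directories are assumptions of mine): run it on each box over the directories you care about, redirect the output to a file, and diff the two files; the mismatching lines are the files to move or delete.

        import hashlib, os, sys

        def listing(root):
            """Print 'md5  relative-path' for every file under root, in a stable order."""
            for dirpath, dirnames, filenames in os.walk(root):
                dirnames.sort()
                for name in sorted(filenames):
                    path = os.path.join(dirpath, name)
                    with open(path, "rb") as f:   # reads each file fully; fine for a sketch
                        digest = hashlib.md5(f.read()).hexdigest()
                    print(digest, os.path.relpath(path, root))

        for root in sys.argv[1:]:
            listing(root)

        # e.g.  python listing.py /export/app /export/conf > boxA.txt   (on each box, then diff)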

    Read the article

  • Software for a company-internal website [closed]

    - by LordT
    I hope this is the right Stack Exchange site to ask this: we have a group of web pages/services at work (SE startup), ranging from SVN, Trac and continuous integration to link collections and a DMS. Nearly everything has an RSS feed to get the info I need, with the exception of SVN. I'm looking for software that can integrate these well on a kind of start page. The most recent changes, upcoming events, etc. should be clearly visible, as well as an option to search (the search itself will be provided by a different tool). A news area should be included as well. Currently I'm pondering doing this with either WordPress or TWiki, although WordPress seems to be the simpler way to get something good-looking quickly. Authentication should be handled by HTTP Basic Auth, which we already have in place and working well. I would normally consider SharePoint a viable option for this, but we're exclusively Mac and Linux, and I won't put up a Windows server just for this.

    Read the article

  • Reliance on Outlook (been a looong time, I know)

    - by AndyScott
    Do you feel that your development group is too reliant on Outlook? Have you reached a point where you have to search your email for pertinent information when asked? What are you using? I realized things had gotten out of hand a couple of weeks ago over a weekend. I was at my in-laws' house (in the country; no PC or laptop, no internet connection) and I got an email on my phone that I needed to reply to, but I couldn't send it without deleting items from my inbox/sent items/etc. Now mind you, I have rules set up to move stuff into folders, and items more than a month old are automatically moved to the PST; but I generally don't manually move items to a PST until I have had a chance to 'work' the item. Please don't bother mocking my process, it's just the way I work. That being said, it was a frustrating process of 'I need all this information, what can I afford to lose?' I work on an international project (think lots of customers), and conversations in 9 or 10 different directions about 10-20 different things are not abnormal for a given day. I have found myself looking data up in Outlook because that's where it is. I think I have reached the point where I don't feel Outlook is up to the task of organizing the data it contains. When you have that many emails (200 or so a day), information seems to get lost at times, and I find that Outlook's search capabilities are lacking. Additionally, I find that any sort of organizational 'system' for sorting emails that can cover multiple topics is a lost cause. But at the same time, the old process of taking the information I got from emails and moving it into another 'notes' type of program has proved too time-consuming. Anyone out there have a better type of system? (Comments about the capacity of my brain and its ability to recall information are not needed.)

    Read the article

  • 1080p HD TV + what is the minimum-spec PC required to stream HD movie files to it?

    - by rutherford
    I want to stream hi-def (non-Flash-based) movies from my future minimum-spec PC to my network-ready HDTV. What I want to know is: a) when streaming from a computer (over the local Wi-Fi network), are the computer's CPU/video/RAM resources used to the same extent as they would be when playing back on the computer's local screen? If not, what are the differences? b) With streaming HD content, what is the minimum-spec processor I should go for if i) only one TV is acting as a client, or ii) two TVs are simultaneous clients?

    Read the article

  • "The requested operation could not be completed due to a file system limitation" 3202

    - by user46529
    I back up a SQL Server database and it fails:

        BACKUP DATABASE dd TO DISK = '\\backupServer\backups\dd.bak'
        WITH COMPRESSION, CHECKSUM, NOFORMAT, INIT,
        BlockSize = 65536, BufferCount = 2200, MaxTransferSize = 4194304

    The backup size is 3 TB and I have 6 TB of free space on the backup server. I am using the backup parameters from the SQLCAT whitepaper. Everything works OK when I back up to a local HDD, but it always fails when I back up to the network share, after about 6 hours, and I can't find out why. Thank you. Yes, the backup over the network is fastest and saves me 3 TB of local disk space :) Thanks for pointing to the memory issue. I left 4 GB to the OS and it worked!
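
    A hedged aside, not from the post itself: the rough arithmetic behind the memory angle is that the backup buffers take approximately BufferCount times MaxTransferSize of memory, which with these parameters comes to several gigabytes and squares with the backup succeeding once more memory was left to the OS.

        # My arithmetic, not from the post: approximate memory consumed by the backup buffers.
        buffer_count = 2200
        max_transfer_size = 4 * 1024 * 1024              # 4 MiB, as in the BACKUP command
        print(buffer_count * max_transfer_size / 2**30, "GiB")   # roughly 8.6 GiB of buffers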

    Read the article

  • How to forbid Postfix from sending to external domains [closed]

    - by elhoim
    I have a local Postfix server, and I want it to relay mail only to the single local domain (localdomain.be):

        myhostname = localdomain.be
        mydomain = localdomain.be
        alias_maps = hash:/etc/aliases
        alias_database = hash:/etc/aliases
        myorigin = $myhostname
        mydestination = $myhostname
        relay_domains = $mydomain
        default_transport = smtp
        relayhost =
        mynetworks = 127.0.0.0/8 [::ffff:127.0.0.0]/104 [::1]/128 10.0.0.0/24
        mailbox_size_limit = 64000000
        message_size_limit = 1000000
        recipient_delimiter = +
        inet_interfaces = all
        inet_protocols = all
        smtp_host_lookup = native

    This configuration works fine for relaying mail locally and to external destination domains, but I would like it to be impossible to send to other domains (e.g. gmail.com). relay_domains is supposed to ensure that, but it does not seem to actually filter, since I can still send to my Gmail address.

    Read the article

  • Remote sessions limited to two of many monitors?

    - by Xaephen
    I have 3 monitors attached to my local PC, arranged in a triangle. My goal is to have the top monitor display local content while displaying a remote session across the bottom two. I have gotten this to work with RDP and spanning, but I am looking for software that would allow the remote session to be restricted to a number of monitors of my choosing, rather than spanning all of them. Does this software exist? I've looked extensively. If there is a hack or something that might do it, I'm totally down for getting my hands dirty.

    Read the article

  • MySql transfer / update (a bit specific)

    - by Jeff
    Before posting I was digging through the whole site but didn't find help for my problem, so I hope someone will help... Facts:

        - 30 GB MySQL database on the remote server (about 20,000,000 rows)
        - data are updated once a week on the local network (MySQL)
        - I need to transfer/replace the remote database with the locally updated one
        - the connection is about 2 MB (real MB, not Mbps) up/down

    The point is that I can't have downtime on the remote MySQL server. Until now I have tried:

        - Navicat data sync - OK, but takes about 3 days to finish
        - dbForge - OK, but needs 5 days to finish
        - mysqldump transferred to the remote server and executed there - about a day, but a lot of downtime
        - rsync of the database folder /mysql/lib/MY_DATABASE - 4 hours, but after that I always have to run a repair on the remote server, which takes about 2 hours, and a lot of downtime
        - mysqldump piped from the command line directly to the remote server - still not satisfied, many problems
        - MySQL replication - slow

    I could give you more things that I have tried... Anyway, what is the best way to refresh the remote MySQL weekly and at the same time have zero downtime and no huge server load? If you have any idea, please share.
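
    A minimal sketch of one low-downtime variant of the dump-and-pipe approach, assuming SSH key access to the remote host and that a staging schema can be created there (the host, user, database and table names are placeholders of mine): load the compressed dump into a staging schema first, then swap it in with RENAME TABLE, so the visible downtime is only the rename.

        import subprocess

        # Dump the local database with a consistent snapshot, compress it, and load it
        # into a staging schema on the remote server in one pipeline.
        dump = "mysqldump --single-transaction --quick mydb | gzip -c"
        load = "gunzip -c | mysql mydb_staging"
        subprocess.run('{} | ssh user@remote "{}"'.format(dump, load), shell=True, check=True)

        # Then, on the remote server, swap the refreshed tables in atomically, e.g.:
        #   RENAME TABLE mydb.big_table         TO mydb.big_table_old,
        #                mydb_staging.big_table TO mydb.big_table;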

    Read the article

  • Generic Pop and Push for List<T>

    - by Bil Simser
    Here's a little snippet I use to extend a generic List class to have similar capabilities to the Stack class. The Stack<T> class is great, but it lives in its own world under System.Object. Wouldn't it be nice to have a List<T> that could do the same? Here's the code:

        public static class ExtensionMethods
        {
            public static T Pop<T>(this List<T> theList)
            {
                var local = theList[theList.Count - 1];
                theList.RemoveAt(theList.Count - 1);
                return local;
            }

            public static void Push<T>(this List<T> theList, T item)
            {
                theList.Add(item);
            }
        }

    It's a simple extension but I've found it useful, hopefully you will too! Enjoy.

    Read the article

  • How to bridge two networks via VPN (IPsec)?

    - by polemon
    I'd like to do site-to-site bridging with VPN (IPsec); how do I do that? On the local side I have a DrayTec Vigor2910, which is supposed to be able to manage IPsec tunnels. I need to have several VPN tunnels to various sites, but how exactly do I do that if the only router I can configure is the local one? As I understand it, I'd need some sort of VPN server or client on the other side. In any event, please clarify this. Thanks.

    Read the article

  • Moving a file using PuTTY

    - by Paul Trotter
    I am a newbie struggling to move a file on a Linux VPS using PuTTY. I can log in with a user in PuTTY, and at that point I can navigate to see the file I wish to move (~/servers/apache-solr-3.6.2/example/webapps/solr.war). By using cd .. a couple of times from the directory I start in when I first log in, I can then navigate to the location I wish to move the file to: usr/local/jakarta/apache-tomcat-5.5.36/webapps/ I know that I need to use cp to copy the file and have tried variations on:

        cp ~/servers/apache-solr-3.6.2/example/webapps/solr.war usr/local/jakarta/apache-tomcat-5.5.36/webapps

    However, each time I get 'No such file or directory'. I have tried excluding the ~/ at the start, and I have tried specifying solr.war at the end of the command. Please excuse the newbie question, but I would really appreciate some advice on what I am doing wrong here.

    Read the article

  • Why can't email clients create rules for moving dates like "yesterday"?

    - by Morgan
    I've never seen an email client in which I could easily create a rule that does something like "Move messages from yesterday to a folder." Is there some esoteric reason why this would be difficult? I know I can easily create rules around specific dates, but that isn't the same thing by a long shot; am I missing something? In Outlook 2010 I can create search folders that do sort of this type of thing, but you can't create rules around a search folder... It seems like either I am missing something major, or this is terribly short-sighted.

    Read the article

  • How does Hadoop decide what its nodes' hostnames are?

    - by Dan R
    Currently the URLs generated by the JobTracker and NameNode use either hostnames like bubbles.local or just bubbles. These end up not resolving unless the client machine has specified them in its /etc/hosts file. When I run the hostname command on these machines it returns a hostname complete with the domain (e.g. bubbles.example.com). Running a small Java test on these machines:

        InetAddress addr = InetAddress.getLocalHost();
        byte[] ipAddr = addr.getAddress();
        String hostname = addr.getHostName();
        System.out.println(hostname);

    produces output just like the hostname command. Where else could Hadoop be grabbing a hostname to use in its JobTracker/NameNode UI? This is occurring in clusters with Hadoop 1.0.3 and a 1.0.4-SNAPSHOT from early August. The machines are running CentOS release 5.8 (Final). The generated URLs I'm referring to look like http://example:50075/browseDirectory.jsp?namenodeInfoPort=50070&dir=/ or http://example.local:50075/browseDirectory.jsp?namenodeInfoPort=50070&dir=/
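
    A hedged diagnostic aside of mine, not from the original post: the name in the UI can come from a reverse DNS lookup rather than from gethostname(), so it may help to compare what forward and reverse lookups say about the machine; a name service that answers with a short name or a .local name would match the symptom.

        import socket

        # Compare the OS hostname, the forward-resolved FQDN, and the reverse DNS name.
        name = socket.gethostname()
        addr = socket.gethostbyname(name)
        print("gethostname():", name)
        print("getfqdn():    ", socket.getfqdn())
        print("reverse DNS for", addr, "->", socket.gethostbyaddr(addr)[0])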

    Read the article

  • apache port number

    - by user983223
    For each development site I want to have a unique port number, for instance domain.com:1234. This is what I have in my httpd.conf file; after a restart, the page at domain.com:1234 does not show in the browser. Is there anything else I need to do besides what I have already done to make this work?

        Listen *:1234
        <VirtualHost *:1234>
            DocumentRoot /var/www/dev_sites/test
            ServerName domain.com:1234
        </VirtualHost>

    It looks like it shows if I go to my local hostname (kk.local:1234). Is there some sort of DNS setup that I need to do? I really don't want to go into GoDaddy every time I add a development site. Is there a way around that?

    Read the article

  • Accessing shared folders with OpenVPN

    - by Ergec
    This is my first attempt to configure a VPN, so I have very little knowledge about this. The network where the CentOS server sits has local IPs 192.168.123.*; the network where the Windows machine sits has local IPs 192.168.1.*. I installed and configured the OpenVPN server on CentOS 5 and the client on a Windows machine, generated all keys, certificates, etc., transferred them to the client, and I'm able to connect to the server. Below is a screenshot of the client log. On the server side I can also see incoming packets with the command tcpdump -n port 1723, so I assume I did most things correctly. But still, when I try to open shared folders using \\192.168.123.33 or \\network-name, I can't access the folders.

    Read the article

  • Is there a way to disable beeps in chm browser and what's causing them?

    - by iuiui
    When browsing a .chm file in the Search tab, I get a beep when I click on each result. I don't think there should be a beep every time I click on a search result; in any case, I'd like to just disable them. I know there is a way to disable beeps system-wide, but I don't want that. It's not the "Start Navigation" sound used in Internet Explorer. This is truly a beep, like the one you get when the PC is not responding and you keep pressing keys on the keyboard, with each key press generating a beep. Besides, I have the Windows "No Sounds" scheme, meaning there shouldn't be any event-driven sounds.

    Read the article

  • ssh asks for password despite ssh-copy-id

    - by Aliud Alius
    I've been using public key authentication on a remote server for some time now, for remote shell use as well as for sshfs mounts. After forcing a umount of my sshfs directory, I noticed that ssh began to prompt me for a password. I tried purging the remote .ssh/authorized_keys of any mention of the local machine, and I cleaned the local machine of references to the remote machine. I then repeated my ssh-copy-id; it prompted me for a password and returned normally. But lo and behold, when I ssh to the remote server I am still prompted for a password. I'm a little confused as to what the issue could be. Any suggestions?
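
    A hedged diagnostic sketch of mine, not from the post: one common cause of "key installed but still prompted" is that sshd (with the default StrictModes) ignores authorized_keys when the remote home directory, ~/.ssh or the file itself is group- or world-writable. Something like the following, run under the remote account, checks the modes; ssh -v on the client would also show whether the key is being offered at all.

        import os, stat

        # Flag any path in the chain that is group- or world-writable, which sshd rejects.
        for path in (os.path.expanduser("~"),
                     os.path.expanduser("~/.ssh"),
                     os.path.expanduser("~/.ssh/authorized_keys")):
            if not os.path.exists(path):
                print(path, "is missing")
                continue
            mode = stat.S_IMODE(os.stat(path).st_mode)
            loose = mode & (stat.S_IWGRP | stat.S_IWOTH)
            print(path, oct(mode), "<- group/world-writable" if loose else "ok")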

    Read the article

  • Mac Snow Leopard Server DNS

    - by panomedia
    I have a Tomcat-driven application on my Windows server that I am planning to move to a Mac mini server. Before I do this, I want to fully test the transition for licensing purposes. I have a FireWire drive set up with Snow Leopard Server and the base app runs just fine, BUT I need to be able to resolve the URL to my domain and not to localhost. So I figured I would set up panomedia.net in the DNS server and also create an A record pointing to my internal network IP, so www.panomedia.net would dish out the same thing as localhost. The problems are: the Tomcat web app starts up going through panomedia.local and not through www.panomedia.net, and my main Network preference panel is still looking at my Comcast DNS search providers even though I put my local IP address as the only DNS server and search provider. I need to test this via an actual domain name before I commit to a 400 GB data move. Can anyone help?

    Read the article

  • Searching Excel sheet for errors

    - by Graphth
    Imagine a huge worksheet with tens of thousands of formulas. I want to be able to quickly find all the errors in order to correct them. I have found that with the normal search procedure I can type in things like #DIV/0! or #NAME? and it will find them, but I would have to type in each of the various error types separately, which is somewhat time-consuming. Is there a way to simply search for any error? One solution we use at work is to wrap most formulas in =IF(ISERROR()) or, nowadays, =IFERROR() and just have it output "error" if there is an error. Is this necessary? Or is there a way to find all the errors without it?
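
    As an aside of my own, not from the question: if stepping outside Excel is acceptable, a small script can do the "any error" search in one pass, assuming the openpyxl library is available and the workbook was last saved by Excel so cached formula results are present (the file name below is a placeholder). If I recall correctly, Excel's built-in Go To Special dialog (F5 > Special > Formulas with only Errors ticked) can also select all error cells at once.

        from openpyxl import load_workbook

        ERRORS = {"#DIV/0!", "#N/A", "#NAME?", "#NULL!", "#NUM!", "#REF!", "#VALUE!"}

        # data_only=True returns the cached results of formulas instead of the formulas.
        wb = load_workbook("workbook.xlsx", data_only=True)
        for ws in wb.worksheets:
            for row in ws.iter_rows():
                for cell in row:
                    if cell.value in ERRORS:
                        print(ws.title, cell.coordinate, cell.value)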

    Read the article

  • commit/update/merge commands in svn

    - by ajsie
    I want to know exactly when I should use each of the commit, update and merge commands in svn. After I've checked out a project and altered the code, should I use update, commit or merge to stay in sync? Correct me if I'm wrong: update = all changes in the repo are copied to your local project; commit = all changes in your local project are copied to the repo; merge = same as above, but you determine the direction? When do I use each command?

    Read the article

  • need a different backup solution

    - by DigitalJedi
    I just built a new media/backup server using Ubuntu 12.04 64-bit. I installed a hard drive to be used only for music, pictures, and videos, and formatted it FAT32 so my one and only Windows PC could map those folders as network shares. My laptop, also running Ubuntu 12.04, is what I use the most, so new media is first downloaded onto the laptop. I've already got the music, videos, and pictures folders from my server mounting as shares on my laptop at boot, thanks to some fstab edits and sshfs. Now I want either an app or a script that can back up any new files I add to my local media folders to the mounted folders on my server. I've been Googling all day and found a few apps like rsync, but they seem to have issues with ext4-to-vfat backups. I thought maybe a script would be best, but I'm new to scripting in Linux and don't want to mess anything up. Basically I am looking for something that will back up only newly added files to the server; I figure I could schedule it once a week. There are some stipulations. For example, my local music folder has over 700 folders, one per artist/band, with subfolders inside those for albums. I want something smart enough to copy only newly added content, so I'm guessing the modified date would probably be a good condition if I were scripting. I'm rambling. Any suggestions would be GREATLY appreciated. I'm not finding anything to suit my needs. I'm almost at the point of just learning bash scripting so I can write something, but then it will be a couple of weeks or so before I have a possible solution, and I'd like something in place sooner.
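
    A minimal sketch of the kind of script described, assuming Python is available and the server share is already mounted (the two paths are placeholders of mine): it walks the local media folder and copies a file only when it is missing on the server, its size differs, or the local copy is newer by more than two seconds, which allows for the coarse timestamps of FAT32.

        import os, shutil

        SRC = "/home/me/Music"            # local media folder (placeholder)
        DST = "/mnt/server/music"         # mounted server share (placeholder)

        for dirpath, dirnames, filenames in os.walk(SRC):
            rel = os.path.relpath(dirpath, SRC)
            target_dir = os.path.join(DST, rel)
            os.makedirs(target_dir, exist_ok=True)
            for name in filenames:
                src = os.path.join(dirpath, name)
                dst = os.path.join(target_dir, name)
                info = os.stat(src)
                # Copy if missing, if sizes differ, or if the source is newer by more
                # than 2 seconds (FAT stores timestamps at 2-second resolution).
                if (not os.path.exists(dst)
                        or os.path.getsize(dst) != info.st_size
                        or info.st_mtime > os.path.getmtime(dst) + 2):
                    print("copying", src)
                    shutil.copy2(src, dst)

    For what it's worth, rsync's --modify-window=2 option exists for exactly this FAT timestamp-resolution issue, so the ext4-to-vfat problem may also be solvable without a script.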

    Read the article

  • Adding custom script on ESXi 5.0

    - by Quzar
    I have an ESXi server that I would like to run a custom script, containing esxcli and other commands, on every boot. I have tried adding the script to init.d and creating an rc.local.d folder with a script in it, but the etc folder gets rebuilt on startup. I've also tried modifying state.tgz and local.tgz in the /bootbank folder to force these files to appear, but that does not seem to work either. Is there any way I can run custom commands on boot? Note: I've tried the advice in "ESXi boot process / state storage" to no avail; it seems the system was changed between 4.1 and 5.0.

    Read the article

  • Why can't we reach some (but not all) external web service via VPN connection?

    - by Paul Haldane
    At work (a UK university) we use a set of Windows servers running WS2008R2 and RRAS which offer a VPN service to students in our accommodation. We do this to associate network connections with individuals. Before they've connected to the VPN, all they can talk to is the stuff that's needed to set up the VPN and a local web site with documentation on how to connect. Medium term we'll probably replace this, but it's what we're using at the moment. VPN on the 2008 servers allocates clients a private (10.x) address. Access to external sites is through NAT on the campus routers (same as any other directly connected client on a private address). Non-VPN connections aren't seeing this problem. Older servers run WS 2003 and ISA 2004. That setup works but has become unreliable under load. The big difference there was that we were allocating non-RFC1918 addresses to the clients (so no NAT required).

    The behaviour we're seeing is that once connected to the VPN, clients can reach local web sites (that is, sites on the campus network) but only some external sites. It seems (but this may be chance) that the sites we can reach are Google ones (including YouTube). We certainly have trouble reaching Microsoft's Office 365 service (which is a pain because that's where mail for most of our students is). One odd bit of behaviour is that clients can fetch (using wget on a Windows 7 client) http://www.oracle.com/ (which gets a 301 redirect) but hang when asked to fetch http://www.oracle.com/index.html (which is what the first URL redirects to). Access works reliably if we configure clients to use our local web proxies (Squid).

    My gut tells me that this is likely to be something in the chain dropping replies, based either on HTTP inspection or on the IP address in the reply. However I'm puzzled about why we're seeing this with the VPN clients. The plan for tomorrow (when I'm back in the office) is to set up a web server on an external connection so that we can monitor behaviour at both ends of the conversation (hoping that the problem manifests itself with our test server). Any suggestions for things we should be looking at?
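
    A minimal sketch of the kind of test server mentioned in the plan, assuming Python on the external host and an arbitrary port (both assumptions of mine): it returns a response body whose size is taken from the request path, so a VPN client can probe at what response size replies stop arriving, e.g. http://testhost:8080/200 versus http://testhost:8080/20000.

        from http.server import BaseHTTPRequestHandler, HTTPServer

        class SizedHandler(BaseHTTPRequestHandler):
            def do_GET(self):
                # A path like /20000 produces a 20000-byte payload.
                try:
                    size = int(self.path.strip("/"))
                except ValueError:
                    size = 1024
                body = b"x" * size
                self.send_response(200)
                self.send_header("Content-Type", "text/plain")
                self.send_header("Content-Length", str(len(body)))
                self.end_headers()
                self.wfile.write(body)

        HTTPServer(("", 8080), SizedHandler).serve_forever()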

    Read the article

  • Cross-platform centralized desktop password manager

    - by Dave
    I have been using KeePass as a desktop password manager on Windows for many years. Love it! However, I now need to work on different platforms for much of my day (Windows 7, Windows XP, Mac OS X, Ubuntu, and OpenSUSE). I'm looking for a password manager I can share across all these platforms. My ideal solution would:

        - Run natively (not in a virtual machine) on all platforms.
        - Store the "official" copy of the password data on the local network so I can get to it from any and all machines. It is OK if it locks (or becomes read-only) while one client is accessing it.
        - Keep a local cached copy (read-only is fine) so I can still get to my passwords when disconnected from the network.

    Does any such beast exist?

    Read the article
