Search Results

Search found 5435 results on 218 pages for 'ones and zeroes'.

Page 86/218 | < Previous Page | 82 83 84 85 86 87 88 89 90 91 92 93  | Next Page >

  • Apt-Get "sources.list needs at least 1 source-URI" error when building dependencies for Xen

    - by Entity_Razer
    What I'm trying to do is install Xen in a test environment. I'm running the apt-get build-dep xen-3.3 command, but it keeps throwing an error which, literally translated from Dutch (I installed Debian in Dutch), says: E: your sources.list (/etc/apt/sources.list) has to contain at least 1 source URI. I've googled it but can't seem to find a definitive answer on how to fix this. The apt-get man page states a source URI needs to be something along the lines of deb ftp://ftp.debian.org/debian stable contrib. I've got 2 HTTP sources (the default Debian ones) up and running, and they've worked flawlessly for the better part of a few days; only now is it starting to act up. Anyone able to help me out? Much obliged!
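    A likely cause: apt-get build-dep fetches source packages, which come from deb-src lines, not from the deb lines used for binaries. A minimal sketch of what sources.list would need (the mirror and suite here are examples; mirror your existing deb lines):

        # /etc/apt/sources.list -- keep the existing deb lines, add matching deb-src ones
        deb     http://ftp.debian.org/debian stable main contrib
        deb-src http://ftp.debian.org/debian stable main contrib

    Run apt-get update afterwards so the source indexes are fetched before retrying build-dep.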

    Read the article

  • Very high Network Out on an EC2 instance

    - by Jatin
    I launched an ubuntu-14.04-64bit instance on Amazon EC2 two days ago, started Tomcat 7.0.54 on it, and deployed my application WAR files. It has no other software installed besides Tomcat and the defaults. In the past 2 days it shows 858 GB of data transfer (Network Out) from that instance; I have attached a graph of the Amazon CloudWatch "Network Out" metric. My application does not download or upload any data. It's a Java Spring application with an HTML/JavaScript front end, and its traffic was very low (fewer than 20 hits) in those 2 days. Is there a way to find out why these transfers happened, and what data was transferred? As you can see in the graph, Network Out peaked at 20 GB per minute. Some more info: Network In was negligible, CPU utilization was very high, and everything else was low.
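    A rough way to catch the culprit in the act, assuming a stock Ubuntu image (the interface name eth0 is an assumption; check with ip addr):

        sudo iftop -i eth0                              # live per-connection bandwidth; spot the top talker
        sudo netstat -tnp | grep ESTABLISHED            # map the connections to the owning process
        sudo tcpdump -i eth0 -n -c 1000 -w sample.pcap  # capture a packet sample for offline inspection

    Heavy outbound traffic plus high CPU on an otherwise idle box is also a classic sign of a compromised instance (for example, one conscripted into a DDoS), so auditing the running processes is worth doing first.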

    Read the article

  • Can't access site intermittently; I get "site not available"

    - by user61438
    I have a news site I check regularly. Once in a while I can't access it at all, from any of my 5 different browsers or even from the command line: nslookup can't resolve it, and ping and tracert fail, all unable to resolve the site name. The problem is specific to this site. The site's support person tells me the problem is a cache on my machine or at my ISP, but I don't think that's true: I access this site 1-3 times a week from one browser only, I only fire up the other ones in times of trouble, and when I have the problem, my secondary machines show the same symptoms. When the problem goes away, everything works well. The question is: what's causing the intermittent DNS/HTTP problems? I don't believe it's something on my machine, because the problem disappears without me having done anything at all.
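    One way to narrow it down while the problem is happening, assuming a command line is handy (the hostname below is a placeholder): query a public resolver directly and compare it with the default one.

        nslookup news-site.example.com 8.8.8.8   # ask a public resolver directly
        nslookup news-site.example.com           # ask the default (ISP) resolver

    If the first query resolves while the second fails, the ISP's resolver (or its ability to reach the zone's nameservers) is the likely culprit rather than any of the local machines.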

    Read the article

  • Wiki on a pendrive - should work with any OS

    - by Florian Pilz
    Hello, I want to have an "extended memory" and want to accomplish that with a wiki on a pendrive. I've already decided to use PmWiki, but if another wiki solves my problem, that's fine too. The issue: if I install Apache on a pendrive, it depends on the running operating system. Because I'm using Windows and Linux (and will use a Mac in the future), it is crucial for me to be platform independent. I read this article; according to it, DokuWiki on a stick is Windows-only, and MoinMoin needs Python installed (which would be possible on my own PCs, but not on public ones). Any help for my "extended memory" is appreciated. PS: As a last resort I could host a wiki on my web page, which would be accessible everywhere, but I see a challenge in trying it on a pendrive.

    Read the article

  • Transparently cache files from a network drive in Linux

    - by Vadim
    We have a Linux server that reads files from a network drive and processes them. In a common scenario, a user will log in and access the same files over and over again. The size of the files varies, but the larger ones can be around 50+ MB. The files seldom change. I was wondering if it's somehow possible to transparently cache the files. I don't want to (and can't) change the program that reads the files, nor do I control the protocol by which the files are accessed. I just want something to detect that I access a certain path, copy the file locally (if needed), and then read the file from the local drive. I've read about Bcache but can't figure out if it's what I need. Do you have any suggestions? Thanks, Vadim.
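    If the network drive is NFS, the kernel's FS-Cache together with the cachefilesd daemon does roughly this transparently; a sketch, assuming a Debian-style system and an NFS export (server and path are examples):

        sudo apt-get install cachefilesd
        # enable the daemon (on Debian/Ubuntu, set RUN=yes in /etc/default/cachefilesd),
        # then mount with the fsc option so reads are cached on local disk:
        sudo mount -t nfs -o fsc fileserver:/export /mnt/files

    Bcache, by contrast, caches local block devices behind an SSD and would not help with a network mount.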

    Read the article

  • What features do you want to see in Windows Server?

    - by Ben Herila
    I work for the Windows Server planning and management team at Microsoft. While I can't share any of our current plans right now, I am interested in what features people would like to see in the next release of Windows Server. Please post your ideas as separate answers, and vote for the ones you want to see. If there is an overwhelming response to an idea or two, I'll personally see what I can do to get the feature included. I know this question is subjective, but I believe this thread is relevant to the community. Looking forward to hearing from you!

    Read the article

  • How to properly store dotfiles in a centralized git repository

    - by asmeurer
    I'd like to put all my dotfiles (like .profile, .gitconfig, etc.) in a central git repository so I can more easily keep track of changes. I did this, but I would like to know how to properly keep them in sync with the actual ones in ~/. I thought you could just hard-link the two using ln, but this does not work as I expected: if I edit one file, the other does not change. Maybe I misused the ln command, or else I misunderstand how hard links work. How do people usually do this? Judging by GitHub, it's a pretty popular thing to do, so surely someone has come up with a seamless way. By the way, I'm on Mac OS X 10.6.
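    The hard links likely break because many editors, and git itself on checkout, write a new file and rename it over the old one; that creates a new inode and silently severs the link. The usual pattern is symlinks from $HOME into the repository; a sketch (the repo URL is hypothetical):

        git clone git@github.com:you/dotfiles.git ~/dotfiles
        ln -s ~/dotfiles/.profile   ~/.profile
        ln -s ~/dotfiles/.gitconfig ~/.gitconfig

    A symlink follows whatever currently sits at the target path, so edits and git operations stay in sync automatically.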

    Read the article

  • What causes Windows Boot to stall?

    - by Nick Berardi
    For about 6 months now I have been having a weird problem where Windows 7 fails to boot fully. What happens is this: (1) "Starting Windows" shows up on the screen; (2) 3 out of 4 times, nothing else happens: no Windows flag animation, nothing at all. After 3 or 4 restarts repeating steps 1-2, the Windows flag animation finally shows up and everything works as expected. My question is: what is causing the stall in steps 1 and 2? I have tried the following with no luck: checking and correcting any disk errors, updating drivers, and doing a clean install of Windows 7. My setup is as follows: Windows 7 64-bit Ultimate, 8 GB RAM, 128 GB Crucial SSD (firmware 0005), Dell Latitude E6410, Intel wireless and graphics. Other than what I have tried above, I am totally out of ideas and looking for some new ones to try.

    Read the article

  • Is there a way to submit a batch of commands to a Cisco router and have them execute from the router?

    - by atroon
    I need to change the configuration of a remote (6 hours' drive away) client's Cisco 871 (IOS 12.4.15T) from my location because of new internet service at his location. To be more precise, I need to change the default route and the IP address of the outside interface (Fa4), and disable the PPPoE setup there. Unfortunately, doing any of this will (obviously) break my connection to the router, and I do not have an out-of-band management modem set up (I know, I know). Is there any way to enter the commands I need and have them execute one after the other from a file on flash:? I have never tried anything like that before; essentially a DOS-style batch file is exactly what I need. Nothing like it seems to exist except using kron to execute CLI commands, but kron is documented as taking only EXEC commands, not configuration ones. Is there hope, or do I have to travel?
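    One approach that avoids kron entirely: stage the new configuration as a text file on flash: and merge it with copy, which executes the lines in order even if your session drops partway through. A sketch (the filename is an example), with a scheduled reload as a safety net:

        ! schedule a rollback in case the change locks you out for good
        router# reload in 15
        ! merge the staged commands into the running config in one operation
        router# copy flash:newwan.txt running-config
        ! if the router is still reachable afterwards, cancel the reload and save
        router# reload cancel
        router# copy running-config startup-config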

    Read the article

  • Set Error-Pages for all Applications in Tomcat

    - by phisch
    I'm trying to set up custom error pages in Tomcat 6, because I don't want the default ones to show up. My error pages are static HTML, not JSP or anything dynamic. I know how to do this through web.xml in each application, but I'd prefer to set up the error pages only once for the entire server. I tried adding the following fragment to the global web.xml (in conf/), but no matter what I put under location, it does not show:

        <error-page>
            <error-code>404</error-code>
            <location>/404.html</location>
        </error-page>

    What do I need to do to globally define custom error pages? Thanks!
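    One caveat worth checking: an error-page entry in conf/web.xml does apply to every webapp, but the location is resolved relative to each webapp's root, so the static file has to exist in every application. A sketch, assuming a stock Debian-style layout (paths are examples):

        # the global rule only takes effect if each webapp can serve /404.html itself
        for app in /var/lib/tomcat6/webapps/*/; do
            cp /srv/error-pages/404.html "$app"
        done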

    Read the article

  • Send notification from HTTP bot (RESTful service or whatever)

    - by Kuroki Kaze
    I have a very simple bot that gathers and parses web pages. It runs on a machine inside the network, behind NAT (so I cannot set up a web server, for example), and I don't have an MTA set up. The bot should notify me about changes in the parsed pages (once an hour or two, to one recipient). How can this be done? Is there any RESTful email gateway, like the SMS ones? I could set up a Twitter account for it and use curl to post statuses/DMs, but it's a very temporary bot.
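    If any SMTP server is reachable from behind the NAT, a reasonably recent curl (7.20+) can deliver mail itself, with no local MTA; a sketch with placeholder addresses:

        # notify.txt is a plain RFC 822 message: From:/To:/Subject: headers, blank line, body
        curl smtp://smtp.example.com --mail-from bot@example.com \
             --mail-rcpt me@example.com --upload-file notify.txt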

    Read the article

  • I'm running Ubuntu notebook on a USB drive with 1GB persistence. Where are my XAMPP files?

    - by CDeanMartin
    I just began running Ubuntu Notebook from my 2 GB USB stick, with 1 GB persistence. After I installed XAMPP, I rebooted to make sure the persistence was working. It worked! My XAMPP installation survived, and it comes up when I type localhost into the browser; the sample apps work, including the ones using MySQL. But I can't find the application files! I am used to the vast labyrinth of files in Windows Explorer, and looking through Files and Folders in GNOME there seem to be only 7 or 8 folders in the entire OS. What is really going on here? I looked through them all, and where is XAMPP? When I code PHP on a WAMP stack, all the .php files go in the www folder. What is the Linux equivalent of the www folder?
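    For what it's worth, XAMPP for Linux installs as a self-contained tree under /opt/lampp, so its document root (the equivalent of the WAMP www folder) should be:

        ls /opt/lampp/htdocs    # XAMPP's document root on Linux; .php files go here

    The GNOME Places menu only exposes a handful of locations; browsing to Filesystem (/) shows the full hierarchy, including /opt.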

    Read the article

  • How to mount remote samba share from local host with multiple groups?

    - by Dragos
    I am using mount.cifs to mount a remote Samba share (both client and server are Ubuntu Server 8.04) like this:

        mount.cifs //sambaserver/samba /mountpath -o credentials=/path/.credentials,uid=someuser,gid=1000

        $ cat .credentials
        username=user
        password=password

    I mounted the share for a local user with username and password, but the problem is that the user is part of multiple groups on the remote system, and with mount.cifs I can only specify one gid. Is there a way to specify all the gids that the remote user has? I want to: 1) mount the remote Samba share with multiple groups on the local system, and 2) browse the mount from 1) in the terminal, since I want to pass some files from the share as arguments to local programs. An alternative would be nautilus sftp://, which runs through GVFS, but newer GNOME no longer writes ~/.gvfs to disk, so I can't browse it in a terminal. The last option would be NFS, but that means I have to synchronize the uids and gids on the local system with the ones from the server.
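    Since mount.cifs accepts exactly one gid per mount, one blunt but workable sketch is mounting the same share once per group the remote user needs (the gids and mountpoints here are examples):

        mount.cifs //sambaserver/samba /mnt/samba-g1 -o credentials=/path/.credentials,uid=someuser,gid=1000
        mount.cifs //sambaserver/samba /mnt/samba-g2 -o credentials=/path/.credentials,uid=someuser,gid=1001

    Both are ordinary kernel mounts, so they can be browsed from a terminal and their files passed to local programs, unlike the GVFS route.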

    Read the article

  • Creating IIS Rewrite Rules

    - by Tom Bell
    I'm having a hard time converting old .htaccess rewrite rules to IIS ones, so I was wondering if anyone could point me in the right direction. Below are some example URLs I would like rewritten:

        http://example.org.uk/about/               ->  http://example.org.uk/about/about.html
        http://example.org.uk/blog/events/         ->  http://example.org.uk/blog/events.html
        http://example.org.uk/blog/2010/11/foo-bar ->  http://example.org.uk/blog/2010/11/foo-bar.html

    The directories and file names are generic and could be anything. Any help would be greatly appreciated.
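    With the IIS URL Rewrite module, the rules live in web.config; a sketch covering the first example (the pattern is an assumption about the layout and would need generalizing for the blog URLs):

        <!-- web.config: rewrite /about/ to /about/about.html -->
        <system.webServer>
          <rewrite>
            <rules>
              <rule name="FolderToHtml" stopProcessing="true">
                <match url="^([^/]+)/$" />
                <action type="Rewrite" url="{R:1}/{R:1}.html" />
              </rule>
            </rules>
          </rewrite>
        </system.webServer>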

    Read the article

  • Security measures for CentOS

    - by cappuccinodrinker
    I have been tightening up my web server's security and wanted to know what else I can do. I am running CentOS 5 with these measures: all passwords to FTP, MySQL, etc. are generated from grc.com/passwords.htm and microsoft.com/protect/fraud/passwords/create.aspx (for the ones which cannot be too long); iptables is running with all ports shut off except HTTP, mail, and SMTP, while important ports like FTP and SSH are blocked to everyone except my static office IP; there is no response to pings; Rootkit Hunter runs daily; the server is PCI compliant according to Comodo; and we are not running any poorly made PHP apps: we use Zend Framework for our own code, have Kayako installed, and keep both up to date. I can't really think of anything else I can do. I could implement a brute-force measure, but I think I already have, simply by changing my SSH port to a number above 10000 and restricting it with iptables.
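    For comparison, a minimal iptables sketch of the policy described (203.0.113.10 stands in for the static office IP; the exact ports are assumptions based on the description):

        iptables -P INPUT DROP
        iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
        iptables -A INPUT -p tcp -m multiport --dports 80,25,110 -j ACCEPT               # http, smtp, mail
        iptables -A INPUT -p tcp -m multiport --dports 21,22 -s 203.0.113.10 -j ACCEPT   # ftp, ssh from the office only

    Beyond moving the SSH port, something like fail2ban watching the auth log is a common additional brute-force countermeasure.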

    Read the article

  • Getting an error while running apt-get update

    - by pradeepchhetri
    I am getting the following error while running apt-get update on all of the servers:

        W: A error occurred during the signature verification. The repository is not updated and the previous index files will be used. Release: The following signatures couldn't be verified because the public key is not available: NO_PUBKEY 8B48AA6247928553
        W: Some index files failed to download, they have been ignored, or old ones used instead.

    The known solution is to log into each server and run the following commands to import the GPG key for that repo:

        sudo gpg --keyserver hkp://subkeys.pgp.net --recv-keys 8B48AA6247928553
        sudo gpg --export --armor 8B48AA6247928553 | sudo apt-key add -

    But this requires logging into each individual server and running the two commands. I am looking for a way to correct this by working on the apt repo server itself. Is there any way to do that?
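    The clean server-side fix is signing the repository's Release file with a key the clients already trust; failing that, the two commands can at least be pushed out in one loop instead of logging in by hand (hostnames are placeholders):

        gpg --keyserver hkp://subkeys.pgp.net --recv-keys 8B48AA6247928553
        gpg --export --armor 8B48AA6247928553 > repo.key
        for h in web1 web2 db1; do
            ssh root@"$h" 'apt-key add -' < repo.key
        done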

    Read the article

  • How to extract JPEGs from a video file using ffmpeg

    - by Andrew Simpson
    I am using C# and ffmpeg. In this scenario I have 279 individual JPEGs, and I have used ffmpeg to create an AVI file from these images on my client:

        -f image2 -r 10 -i "C:\000EC902F17F\img%05d.jpg" -s 352x288 -y "C:\1\test.avi"

    I then upload the file to my server and extract JPEGs from the AVI:

        -i c:\1\1.avi c:\1\img-%05d.jpg

    I get 265 JPEGs back, so ffmpeg is most probably dropping frames when the AVI is first created. Is there a way to force it to encode using ALL the images I have? Thanks. PS: I did not specify any command-line options other than the size of the video output. As far as I am aware, if none are specified, ffmpeg automatically chooses the best ones?
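    Unless told otherwise, ffmpeg resamples the frame rate between input and output, dropping or duplicating frames to make the timestamps fit. A sketch that pins the rate at both ends and disables that resampling on extraction (-vsync 0), so the frame count should survive the round trip:

        -f image2 -r 10 -i "C:\000EC902F17F\img%05d.jpg" -r 10 -s 352x288 -y "C:\1\test.avi"
        -i c:\1\1.avi -vsync 0 c:\1\img-%05d.jpg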

    Read the article

  • Can this be improved? Scrubbing dangerous HTML tags

    - by chobo2
    I've found that for something I consider pretty important, there is very little information and very few libraries on how to deal with this problem. I found the following while searching. I really don't know all the million ways a hacker could try to insert dangerous tags. I have a rich HTML editor, so I need to keep harmless tags but strip out dangerous ones. Is this script missing anything? It uses Html Agility Pack.

        public string ScrubHTML(string html)
        {
            HtmlDocument doc = new HtmlDocument();
            doc.LoadHtml(html);

            //Remove potentially harmful elements
            HtmlNodeCollection nc = doc.DocumentNode.SelectNodes("//script|//link|//iframe|//frameset|//frame|//applet|//object|//embed");
            if (nc != null)
            {
                foreach (HtmlNode node in nc)
                {
                    node.ParentNode.RemoveChild(node, false);
                }
            }

            //Remove hrefs to java/j/vbscript URLs
            nc = doc.DocumentNode.SelectNodes("//a[starts-with(translate(@href, 'ABCDEFGHIJKLMNOPQRSTUVWXYZ', 'abcdefghijklmnopqrstuvwxyz'), 'javascript')]|//a[starts-with(translate(@href, 'ABCDEFGHIJKLMNOPQRSTUVWXYZ', 'abcdefghijklmnopqrstuvwxyz'), 'jscript')]|//a[starts-with(translate(@href, 'ABCDEFGHIJKLMNOPQRSTUVWXYZ', 'abcdefghijklmnopqrstuvwxyz'), 'vbscript')]");
            if (nc != null)
            {
                foreach (HtmlNode node in nc)
                {
                    node.SetAttributeValue("href", "#");
                }
            }

            //Remove imgs with src pointing to java/j/vbscript URLs
            nc = doc.DocumentNode.SelectNodes("//img[starts-with(translate(@src, 'ABCDEFGHIJKLMNOPQRSTUVWXYZ', 'abcdefghijklmnopqrstuvwxyz'), 'javascript')]|//img[starts-with(translate(@src, 'ABCDEFGHIJKLMNOPQRSTUVWXYZ', 'abcdefghijklmnopqrstuvwxyz'), 'jscript')]|//img[starts-with(translate(@src, 'ABCDEFGHIJKLMNOPQRSTUVWXYZ', 'abcdefghijklmnopqrstuvwxyz'), 'vbscript')]");
            if (nc != null)
            {
                foreach (HtmlNode node in nc)
                {
                    node.SetAttributeValue("src", "#");
                }
            }

            //Remove on<Event> handlers from all tags
            nc = doc.DocumentNode.SelectNodes("//*[@onclick or @onmouseover or @onfocus or @onblur or @onmouseout or @ondoubleclick or @onload or @onunload]");
            if (nc != null)
            {
                foreach (HtmlNode node in nc)
                {
                    node.Attributes.Remove("onfocus");
                    node.Attributes.Remove("onblur");
                    node.Attributes.Remove("onclick");
                    node.Attributes.Remove("onmouseover");
                    node.Attributes.Remove("onmouseout");
                    node.Attributes.Remove("ondoubleclick");
                    node.Attributes.Remove("onload");
                    node.Attributes.Remove("onunload");
                }
            }

            //Remove any style attributes that contain the word expression (IE evaluates this as script)
            nc = doc.DocumentNode.SelectNodes("//*[contains(translate(@style, 'ABCDEFGHIJKLMNOPQRSTUVWXYZ', 'abcdefghijklmnopqrstuvwxyz'), 'expression')]");
            if (nc != null)
            {
                foreach (HtmlNode node in nc)
                {
                    node.Attributes.Remove("style");
                }
            }

            return doc.DocumentNode.WriteTo();
        }

    Edit: Two people have suggested whitelisting. I actually like the idea of whitelisting but have never done it, because no one could tell me how to do it in C#, and I couldn't even find tutorials for it (the last time I looked; I will check again). How do you make a whitelist: is it just a list collection? How do you parse out all the HTML tags, script tags, and every other tag? Once you have the tags, how do you determine which ones are allowed: compare them to your list collection? But what happens if the content comes in with 100 tags and you have 50 allowed? Comparing each of those 100 tags against the 50 allowed ones is quite a bit of work and could be slow. And once you find an invalid tag, how do you remove it? I don't really want to reject a whole block of text just because one tag was found to be invalid; I'd rather remove it and keep the rest. Should I be using Html Agility Pack?
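    To answer the whitelist questions concretely: one pass over the parsed tree with a HashSet lookup per node is linear in the document size, not allowed-tags times tags, and removing a bad tag can keep its children so the surrounding text survives. A sketch using Html Agility Pack (the allowed sets are examples; href/src values still need the scheme checks from the code above):

        // requires: using System.Collections.Generic; using System.Linq; using HtmlAgilityPack;
        private static readonly HashSet<string> AllowedTags = new HashSet<string>(
            StringComparer.OrdinalIgnoreCase) { "p", "b", "i", "em", "strong", "ul", "ol", "li", "a", "img", "br" };
        private static readonly HashSet<string> AllowedAttrs = new HashSet<string>(
            StringComparer.OrdinalIgnoreCase) { "href", "src", "alt", "title" };

        public string WhitelistScrub(string html)
        {
            var doc = new HtmlDocument();
            doc.LoadHtml(html);
            Scrub(doc.DocumentNode);
            return doc.DocumentNode.WriteTo();
        }

        private void Scrub(HtmlNode parent)
        {
            foreach (HtmlNode node in parent.ChildNodes.ToList()) // iterate a copy; we mutate the list
            {
                Scrub(node); // post-order: the subtree is clean before we judge this node
                if (node.NodeType != HtmlNodeType.Element) continue;
                if (!AllowedTags.Contains(node.Name))
                {
                    // drop the tag but keep its (already scrubbed) children, so one bad
                    // tag doesn't reject the surrounding text; for script/style you would
                    // use RemoveChild(node, false), since their text is code, not content
                    parent.RemoveChild(node, true);
                }
                else
                {
                    foreach (HtmlAttribute attr in node.Attributes.ToList())
                        if (!AllowedAttrs.Contains(attr.Name))
                            node.Attributes.Remove(attr);
                }
            }
        }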

    Read the article

  • How can I keep websites from knowing where I live?

    - by D Connors
    This question is about practical annoyances, not security. I live in Brazil and, apparently, every single website I visit knows it. Usually that's OK, but quite a few sites don't use that information well. For instance: Bing keeps thinking Brazilian pages are far more relevant to me than American ones (which they're not); Google.com always redirects me to google.com.br; and Microsoft automatically sends me to horribly translated support pages in Portuguese (which would just be easier to read in English). These are just a few examples. Usually it's stuff I can live with (or work around), but some of it is just plain irritating. I have geolocation disabled in Firefox, so I guess they're getting this information either from my IP or from Windows itself (which I bought here). Is there a way to avoid this, either by telling them nothing or by making them think I live somewhere else? Thanks

    Read the article

  • Windows 7 tips and tricks

    - by Pyrolistical
    Related question: Which Windows tweaks do you use that actually work? Tell us your favorite Windows 7 tips and tricks. Here are some I bet you've never heard of: Win+Arrow and Win+Shift+Arrow control the window location and even move the window to the next monitor if you have multiple ones; Win+P controls projection/multiple monitors; the pinned icons on the new taskbar can be launched with Win+1, Win+2, etc.; and you can launch a second instance of a pinned application with the middle mouse button, meaning you can open another Firefox window just by wheel-clicking its icon! (From The Bumper List of Windows 7 Secrets.) And a few more off the top of my head: use the Favorites at the top left of Windows Explorer (drag commonly used folders to it; it's super handy); you can drag tray icons in and out of the hidden-icons area; and the Show Desktop button is now the rectangle next to the clock on the taskbar. What tips and tricks do you have?

    Read the article

  • Dynamic VPN tunneling technologies

    - by Adam
    OK, so I'm asking a more specific question this time. I'm writing a paper about Cisco's DMVPN, and one of my tasks is an analysis of available network solutions that use dynamic VPN tunnels. Because the paper is about DMVPN, I have to compare those solutions to it. I know there are a lot of dynamic tunneling technologies, but I'm looking for ones that can be compared to DMVPN. So the question is: are there any technologies that use dynamic VPN tunnels (not necessarily using crypto) and can be compared to DMVPN? What are those technologies?

    Read the article

  • System Restore Points

    - by Ross Lordon
    Currently I am investigating how to schedule automatic creation of a system restore point on all of the workstations in my office. Task Scheduler already has some nice defaults (screenshots below), and the history for this task confirms that it is running successfully. However, when I go to Recovery in the Control Panel, it only lists one system restore point for each previous week, going back just 3 weeks, even if I check the "show more restore points" box. Why don't the additional ones appear? Would there be a better way to implement this, such as via Group Policy or a script? Is there any documentation on this online? I've had a hard time tracking anything down except how to work around a Group Policy that disables the feature.

    Read the article

  • Broken partition: is recovery possible?

    - by claw
    I was using CopyWipe on Hiren's BootCD to copy a Windows installation to a new drive. Unfortunately for me, I was rushing: I set the source drive to the USB drive running Hiren's/CopyWipe and the target to the Windows partition, and I think this has destroyed the partition tables and replaced them with the Hiren's boot USB ones. The disk was NTFS with 40/250 partitions; it is now FAT32 with a 145/145 partition. I have used several partition recovery tools, DiskDigger to name one; they all show a recovered partition, but it's the Hiren's content. Any advice would be a fantastic help. Update for anyone with a similar issue: I recommend the TestDisk (undelete partition) tool, which you can also get as part of Hiren's BootCD. See the answer below.

    Read the article

  • How can I debug very high "Dispatch Unit" CPU utilisation on a Cisco ASA firewall from ASDM?

    - by Andy
    I have recently had my first firewall installed, so I am very new to this whole situation. I am finding that the Dispatch Unit process becomes overloaded, and it appears to be the reason I get serious bouts of lag on my server. The firewall has had little configuration apart from my blocking all ports in "Access Rules" and allowing only the ones the server needs, and only from where it needs them. I guess what I am after is assistance with locating the issue that causes "Dispatch Unit" to take up all the CPU. Regards. Edit: With ASDM statistics I found that inbound packets (peaking at 70-100k/sec, from under 1k/sec normally), inbound traffic (peaking at 40-50 kbit/sec, from under 1 kbit/sec normally), and CPU all peak at the same time, so I am pretty sure it is an attack of some sort, but as a beginner with the ASA I am not sure how to resolve it.
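    A few show commands that narrow down what the Dispatch Unit is chewing on (the sorted variant needs a newer image, otherwise plain show processes cpu-usage; the threat-detection one assumes basic threat detection is enabled, the default on 8.0+):

        ciscoasa# show processes cpu-usage sorted non-zero    ! confirm Dispatch Unit is the hog
        ciscoasa# show traffic                                ! per-interface packet and byte rates
        ciscoasa# show conn count                             ! connection-table pressure
        ciscoasa# show threat-detection statistics top access-list

    Dispatch Unit is the ASA's data-path process, so a sustained flood of small inbound packets pegging it is consistent with the attack theory; rate-limiting at the upstream provider is usually the practical fix.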

    Read the article

  • Apache Rewrite Rules breaking each other?

    - by neezer
    I have this rule:

        RewriteCond %{REQUEST_URI} ^/(manhattan|queens|westchester|new-jersey|bronx|brooklyn)-apartments/.*$
        RewriteCond %{REQUEST_URI} !^/guide/(.*)$
        RewriteRule ^(.*)$ /home/neezer/public-html/domain.com/guide/$1 [L]

    which works great on its own. Essentially, I have a bunch of directories full of files that I want to keep in the /guide folder but have appear at the web root for SEO reasons. The rule works, but unfortunately the original URLs (with /guide) still work too. I want to 301-redirect the URLs with /guide to the ones without it, without actually moving the files on the server. I tried adding this rule:

        RewriteCond %{REQUEST_URI} ^/guide/(manhattan|queens|westchester|new-jersey|bronx|brooklyn)-apartments/.*$
        RewriteRule ^guide/(.*)$ http://www.domain.com/$1 [R=301,L]

    but that breaks my first rule completely. Any thoughts about what I might be doing wrong? Please let me know if you need anything else from me to help with this issue.
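    One common cause of this kind of interference: after the first rule rewrites internally, mod_rewrite runs the rule set again against the rewritten URI, and conditions keyed on %{REQUEST_URI} can then match things the client never requested. A defensive sketch, keying the 301 on %{THE_REQUEST} (the raw request line, untouched by internal rewrites) and putting it first:

        # 1. external redirect, keyed to what the client actually asked for
        RewriteCond %{THE_REQUEST} ^[A-Z]+\s/guide/((manhattan|queens|westchester|new-jersey|bronx|brooklyn)-apartments/\S*)
        RewriteRule ^ /%1 [R=301,L]

        # 2. internal mapping, as before
        RewriteCond %{REQUEST_URI} ^/(manhattan|queens|westchester|new-jersey|bronx|brooklyn)-apartments/.*$
        RewriteCond %{REQUEST_URI} !^/guide/
        RewriteRule ^(.*)$ /home/neezer/public-html/domain.com/guide/$1 [L]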

    Read the article
