Search Results

Search found 27042 results on 1082 pages for 'google forms'.

Page 850/1082 | < Previous Page | 846 847 848 849 850 851 852 853 854 855 856 857  | Next Page >

  • Apache log lines contain "..."

    - by mtah
    We have a custom log line format for our Apache logs, which are then analyzed. CustomLog "|/usr/sbin/rotatelogs -l /mnt/var/log/apache2/access-%Y%m%d%H%M%S.log 900" "%a %{%s}t \"%r\"" However, some log lines are mysteriously shortened with "..." for some reason. How can this be? The shortest line where this occurs is 317 chars, while the longest line is well over 2000 chars. "GET /exposure?sg=&ap=0x0&fv=WIN%2010,0,22,87&si=IH95VDUAVLJ0&pt=Lage%20hjemmelaget%20sengegavl%20-%20Forum%20-%20Diskusjon.no&iv=0&sd=1024x600&ct=680&tz=-120&eu=http%3A//www.diskusjon.no/index.php%3Fshowtopic%3D1011139&l...AS3&an=NO%20-%20180x500%20Pretail%20CPC&wd=1024x483&rf=http%3A//www.google.no/search%3Fhl%3Dno%26source%3Dhp%26q%3Dsengegavl+lage%26meta%3D%26aq%3D2%26aqi%3Dg10%26aql%3D%26oq%3Dsengega%26gs_rfai%3D&ui=3INYF5QAZL10&ws=0x417&ad=180x500&sa= HTTP/1.1"
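
    A first diagnostic step (not part of the question, just a hedged sketch) is to confirm whether the "..." is really present in the file on disk or is only introduced by whatever tool views or analyzes the logs. The path below is taken from the question's rotatelogs pattern and may need adjusting:

        # count lines in the rotated logs that literally contain "..."
        grep -c '\.\.\.' /mnt/var/log/apache2/access-*.log

        # print the longest line length actually stored on disk
        awk '{ if (length($0) > max) max = length($0) } END { print max }' /mnt/var/log/apache2/access-*.log

    If grep finds no literal "..." and awk reports lines well over 2000 characters, the truncation happens in the analysis or viewing step rather than in Apache or rotatelogs.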

    Read the article

  • Selectively routing traffic via ethernet or wifi, with proper DNS (Mac OS X 10.6)

    - by Dan
    When I'm at work, I access various intranet pages as well as the wider Internet through ethernet. However, the company LAN blocks some ports (e.g., Google Calendar), which I can reach over WiFi. So I gave the Airport priority and, using route add, set up selective routing: all intranet traffic goes through the ethernet and everything else via WiFi: sudo route add 10.0.0.0/8 <intranet gateway>. However, a number of intranet sites have their own DNS; i.e., hr.company.com only resolves on the intranet. The only way I can get the DNS to work properly is to add the internal DNS server to the Airport DNS listing; however, I fear that when I go elsewhere and forget to remove it, this will break things. What's the right way to get DNS to resolve with this setup?
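
    On OS X, domain-scoped resolvers can handle exactly this split: a file in /etc/resolver named after the internal domain makes only that domain resolve against the internal DNS server, independent of the Airport's DNS list. A minimal sketch, assuming a hypothetical internal domain company.com and internal DNS server 10.0.0.53:

        sudo mkdir -p /etc/resolver
        printf 'nameserver 10.0.0.53\n' | sudo tee /etc/resolver/company.com
        scutil --dns    # lists the per-domain resolvers now in effect

    Away from the office the internal server is simply unreachable and only company.com lookups fail, so nothing else breaks.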

    Read the article

  • Mozilla nonsense. Page changes size by itself

    - by Browser Madness
    I have never intentionally changed the font size in the latest Mozilla Firefox install on my Windows machine. For example, the Google site is now at 200% size, and I did nothing to make this happen. What's worse is that it does not change back but remembers this! Similarly, other sites are too small, and the setting is remembered per site. What is going on here? I mean, what nonsense! How can I undo this? And for extra points, who came up with this absurd behavior at Mozilla? Not making this up, folks. Firefox 15.0.1. It's not at all clear why it changes size or how to go back to the default size for these sites. Actually, it just happened again while editing this entry: an icon changes and then the font size is too small.

    Read the article

  • Install Composer on Ubuntu

    - by Milos
    I am trying to install Composer with the command: sudo curl -s https://getcomposer.org/installer | php And I am getting this error: All settings correct for using Composer Downloading... Download failed: failed to open stream: Permission denied Downloading... Download failed: failed to open stream: Permission denied Downloading... Download failed: failed to open stream: Permission denied The download failed repeatedly, aborting. I don't know why. Do you have any idea? I tried to Google it but found nothing.
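
    One hedged explanation (an assumption, since the question doesn't show the working directory): with sudo curl ... | php, only the download runs as root; the installer itself runs as the regular user and has to write composer.phar into the current directory, so a directory that user cannot write to produces exactly this "failed to open stream: Permission denied" loop. A sketch that avoids the pipe and makes the write location explicit:

        cd /tmp                                        # any directory the current user can write to
        curl -sS https://getcomposer.org/installer -o composer-setup.php
        php composer-setup.php                         # writes composer.phar into the current directory
        sudo mv composer.phar /usr/local/bin/composer  # optional: make it available system-wide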

    Read the article

  • Show image in cell based on cell value

    - by JoeM
    I am creating a weekly income comparison table. I want to compare cells A5 and A10 and display an appropriate image in cell B7. The image will be either an UP or a DOWN arrow (reflecting an increase or decrease). I've created a formula which puts the text up or down in the cell, but I don't know how to replace it with the up/down image. =IF(D77>D69, "up", IF(D77<D69, "down")) How can I do it? I know it's possible. Note: I am using the Google Docs spreadsheet, so please let me know if this cannot be done there and I have to use the real Excel installed on my laptop instead.
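
    A lightweight variant that stays within the question's own formula (a sketch, not a true image-based solution): Unicode arrow characters display in both the Google Docs spreadsheet and desktop Excel, so substituting them for the "up"/"down" text gives a visual indicator without inserting actual images. The cell references are the ones from the question's formula:

        =IF(D77>D69, "▲", IF(D77<D69, "▼", ""))

    Combined with conditional formatting (for example green when D77>D69, red otherwise), this reads much like an arrow icon while remaining an ordinary formula.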

    Read the article

  • Someone used my postfix smtp (port 25) to send spam mails to me

    - by Andreas
    This week, someone started to send spam mails through my Postfix SMTP access (I verified this by logging in through telnet from an arbitrary PC and sending mails with arbitrary sender IDs myself) on my server, with recipient and target being [email protected]. Since I have a catch-all and mail forwarding to my Google account, I received all those (many) mails. After a lot of configuration (I lost track of which change did what, going through dozens of topics here and over the net), that hole seems fixed. Still, what happened? Does port 25 need to be open and accepting for my catch-all to work? What configuration did I do wrong? I remember the first thing I changed (that had an effect) was the inet_interfaces setting in main.cf, only to find out later that if this does not say "all", my mail to mydomain.com does not get forwarded any more.
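
    For reference, the usual way to keep port 25 open for incoming mail (which a catch-all does need) while refusing to relay for strangers is to restrict relaying rather than the listening interface. A minimal sketch using postconf, under the assumption that only the local machine should be allowed to relay outbound mail:

        sudo postconf -e 'mynetworks = 127.0.0.0/8 [::1]/128'
        sudo postconf -e 'smtpd_recipient_restrictions = permit_mynetworks, permit_sasl_authenticated, reject_unauth_destination'
        sudo postfix reload
        postconf -n    # review the effective non-default settings

    With reject_unauth_destination in place, outsiders can still deliver mail to domains the server accepts (so the catch-all keeps working), but they can no longer use the server to send mail to third parties.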

    Read the article

  • IE8 Stopped Keeping History

    - by BillP3rd
    Like the title says, my IE8 has apparently stopped keeping the history of pages I've visited. I've searched SU and Google and can't find anything that seems to describe what I'm seeing. I have IE set to retain history for 999 days (the maximum allowed). Apart from today and last Thursday, IE appears to be oblivious to any activity more recent than three weeks ago; clicking on either "Thursday" or "Today" reveals no recorded history, however. Very odd behavior. Finally, the history does extend back 30 weeks, to when I built the computer, and there is recorded history for every week. I'd appreciate suggestions. NB: Windows 7 Ultimate, x64 (but 32-bit IE8).

    Read the article

  • Setting up Mail (Ubuntu Server 10.04)

    - by Sam
    It seems that mail on my server is functional, kind of. I have a simple PHP mailer script that is capable of mailing any email address that is not local, e.g. [email protected]. However, if I try [email protected], it tells me "Saved message in /home/myuser/dead.letter" and sometimes "You have new mail." What does this mean? Does anyone know what I can do to make my emails actually go through? I'm using Google Apps for domains, so basically Gmail is hosting my domain's email and I'm not using my own servers for receiving mail. I've set up the nameservers correctly. I wonder if it's got to do with my sending configuration (what does PHP's mail function normally send with anyway, sendmail?) or maybe it's my receiving configuration; maybe something makes it act differently for "local" mail.
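
    A hedged reading, since the question doesn't say which MTA is installed: "You have new mail" suggests that bounce messages are landing in a local mailbox, and dead.letter is where a failed message gets saved; together they usually mean the server treats mydomain.com as one of its own local domains and never looks up the MX records that point at Google. If the box happens to run Postfix (an assumption), the relevant knob is mydestination:

        postconf mydestination            # check whether mydomain.com is listed as a local destination
        sudo postconf -e 'mydestination = localhost.localdomain, localhost'
        sudo postfix reload

    If it runs sendmail instead, the equivalent step is removing the domain from /etc/mail/local-host-names.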

    Read the article

  • Aggressive Auto-Updating?

    - by MattiasK
    What do you guys think is best practice regarding auto-updating? Google Chrome, for instance, seems to auto-update itself as soon as it gets a chance, without asking, and I'm fine with that. I think most "normal" users benefit from updates being a transparent process. Then again, some more technical users might be miffed if you update their app without permission. As I see it, there are a couple of options: 1) Have a checkbox when installing that says "allow automatic updates" 2) Just have a preference somewhere that allows you to "disable automatic updates" so that you have to "check for updates manually" I'm leaning towards 2) because 1) feels like it might alienate non-technical users, and I'd rather avoid installation queries if possible. I'm also thinking about making it easy to downgrade if an upgrade (heaven forbid) causes trouble; what are your thoughts? Another question: even if updates are applied automatically, perhaps they should be announced, if there are new features for example; otherwise you might not realize they exist and never use them. One thing that kinda scares me, though, is the security implications: someone could theoretically hack my server and push out spyware/zombieware to all my customers. It seems that using digital signatures to prevent man-in-the-middle attacks is the least you could do; otherwise you might be hooked up to a network that spoofs the address of the update server.
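
    On the signature point, here is a minimal sketch of verifying a detached signature before applying an update, assuming GnuPG and a hypothetical update bundle name (it is not tied to any particular update framework):

        # the vendor public key ships with the application at install time, never over the update channel
        gpg --import vendor-release-key.asc
        gpg --verify update-1.2.3.tar.gz.sig update-1.2.3.tar.gz \
            && tar -xzf update-1.2.3.tar.gz \
            || echo "signature check failed, refusing to apply update"

    Because the check is against a key baked into the installed application, a spoofed update server or a man-in-the-middle can at worst serve an old or corrupt file, not a tampered one; pairing it with a version check also blocks forced downgrades.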

    Read the article

  • Windows 8 Remote Desktop only allows one user at a time?

    - by segmentation fault
    I tried connecting to Windows 8 using its built-in Remote Desktop feature, but for some inexplicable reason, it requires that no users are logged in on the target machine before a remote user can log in. This has never been a problem with rdesktop on Unixen; I could rdesktop from as many machines as I wanted and any logged-in users would never notice a thing. What's the problem with Windows? Any way to allow concurrent local and remote logins to a Windows 8 machine without hacks or cracks? The "guides" on how to do this that show up in the Google results all suggest replacing a system DLL with a hacked one, but that's not acceptable.

    Read the article

  • How do spambots work?

    - by rlb.usa
    I have a forum that's getting hit a lot by forum spambots, and of course the best way to defeat something is to know thy enemy. I'll worry about defeating those spambots later, but right now I'd like to know more about them. Reading around, I was surprised by the lack of thorough information on the subject (or perhaps by my ineptness at picking the right search terms for better Google results). I'm interested in learning all about spambots. I've asked on other forums and gotten brush-off answers like "Spambots are always users registering on your site." How do forum spambots work? How do they find the 'new user registration' page? (I'm especially surprised because some forums don't have a dedicated URL for this, e.g. www.forum.com/register.html, but instead use query strings or even other methods invisible in the URL bar.) How do they know what to enter into each 'new user registration' field? How do they determine which pages they can spam or enter data into and which they cannot? Do they even 'view' the page at all? If not, then I'd assume they're communicating with the server directly; how is this possible? How do they do it? Can forum spambots break CAPTCHAs? Can they solve logic questions (how?)? Math questions? Do they reverse-engineer client-side anti-bot validation scripts? Server-side scripts? What techniques are still valid to prevent them? Where do spambots come from? Is someone sitting behind the computer snickering as they watch their bot destroy site after site? Or are they snickering as they simply 'release' it onto the internet somehow? Are spambots 'run' by infected computers somewhere? Do they replicate themselves? Etc.

    Read the article

  • Is there a navigation app for iPad which re-calculates the route?

    - by earlyadopter
    My iPad 3G successfully shows my current location, but Google Maps does not re-calculate the route if I don't follow exactly the one it initially suggested. Normal car navigators re-calculate on the fly. The CoPilot Live HD app I see in the App Store has very bad feedback. Do you know any others that are better, please? I need it with maps for the continental U.S., and it has to be able to re-calculate based on my real current location. I'd be OK even if it won't do that automatically; I'd just tap some button.

    Read the article

  • How can I use my own, external IP instead of localhost with Glassfish?

    - by Debopam
    I am using Glassfish v3 to develop a couple of servlets. For testing the servlets, localhost:8080/MyServlet works fine. But whenever I use the IP instead of localhost, it returns an error saying: Oops! Google Chrome could not connect to xxx.xx.xx.xxx:8080. The listener address in Glassfish is already set to 0.0.0.0. I even tried changing the 0.0.0.0 to my IP, but then Glassfish does not start, saying the port is not empty. This was not a problem with XAMPP while I used it for PHP development. I am using Windows 7.
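
    Since the listener is already bound to 0.0.0.0, one common culprit on Windows 7 (an assumption, as the question doesn't mention it) is the Windows Firewall allowing localhost traffic but blocking inbound connections to port 8080 from other machines. A sketch of opening the port from an elevated command prompt:

        netsh advfirewall firewall add rule name="GlassFish HTTP 8080" dir=in action=allow protocol=TCP localport=8080

    Testing afterwards from another device on the same network (rather than from the server itself) shows whether the firewall rule was the missing piece.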

    Read the article

  • WiFi problems on several Ubuntu installations

    - by Rickyfresh
    Okay, this is the first time I have ever had to ask a question, as usually the Ubuntu community has already answered everything, but on this occasion many people are asking and no good solution has appeared yet, so someone please help or I will have to install Windows on my son's and my girlfriend's PCs, and that would be a disaster as I am trying to convince people to move away from Windows. I installed 12.04 on three computers on the same day: a Dell Inspiron (works perfectly), a Toshiba Satellite, and a home-built desktop. The Dell works perfectly, but the other two either keep losing the connection to the wireless Internet, or, even when they are connected, they stop connecting to web sites; for some reason searching Google works fine, but web sites will not load when a link is clicked. So far people have recommended in other forums: removing Network Manager and installing wicd (didn't solve it); changing the MTU in the wireless settings (didn't solve it); all sorts of messing about with Firefox settings (this doesn't solve it, and even if it did, it would leave most average PC users scratching their heads and wishing they had stuck with Windows). The problem exists on two very different machines with different wireless cards, so I doubt it's a driver or hardware issue; also, many other Ubuntu users are having the same problem with a vast array of different machines and wireless cards. Can someone please give a good solution to this, as it's going to turn a lot of people away from Ubuntu if it cannot be sorted. I would give some PC specs, but the two machines are vastly different, and the other people complaining of this problem also have very different systems, all showing the same problem.
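
    Not an answer from the thread, just a hedged diagnostic sketch for the two misbehaving machines; the interface name wlan0 is an assumption and may differ:

        lspci -nnk | grep -A3 -i network              # identify the wireless chipsets and the drivers in use
        iwconfig wlan0 | grep -i "Power Management"   # check whether wireless power management is enabled
        sudo iwconfig wlan0 power off                 # temporarily disable it; a known cause of dropouts on some chipsets
        ping -c4 8.8.8.8 && ping -c4 google.com       # separates "no connectivity at all" from "DNS is failing"

    The "Google search works but clicked links don't load" symptom is worth re-testing right after the power-management and DNS checks, since it tends to point at flaky DNS or MTU rather than at the radio link itself.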

    Read the article

  • Preventing apps from accessing info from the wifi device?

    - by heaosax
    Browsers like Chrome and Firefox can use my wifi device to get information about the surrounding APs and pinpoint my physical location using Google Location Services. I know these browsers always ask for permission to do this, and that these features can also be "turned off". But I was wondering if there's a better way to prevent ANY application from accessing this information from my wifi device. I don't like anyone on the internet knowing where I live, and I am worried some software could do the same as these browsers but without asking for permission. I am using Ubuntu 10.04.

    Read the article

  • Viewing zip archive contents using 'less' on OS X.

    - by multihead
    I couldn't help but notice that the 'less' program on all of the recent Linux distributions I've used (Ubuntu and Gentoo in this case) allows me to view the contents of ZIP and TAR archives, while the install of 'less' that I have on OS X (and Solaris) instead produces a "foo.zip may be a binary file. See it anyway?" prompt and proceeds to spit out the raw binary data instead of a nice file structure listing. Google has not produced much in the way of helpful results -- it's tricky to search for 'less' in this context. I downloaded and built the latest version from greenwoodsoftware.com, but even it refuses to show the contents of these archives. I didn't come across any related configure/build options either. Any ideas? Thanks!
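
    For what it's worth, the archive listing on Linux typically isn't a feature of less itself but of an input preprocessor (lesspipe) wired in through the LESSOPEN environment variable, which is why rebuilding less doesn't bring it along. A sketch of wiring it up by hand, assuming a lesspipe script is installed somewhere (the exact path is an assumption and varies by system and package manager):

        echo "$LESSOPEN"                                  # see whether a preprocessor is configured at all
        export LESSOPEN="| /usr/local/bin/lesspipe.sh %s"
        less foo.zip                                      # should now show an archive listing instead of raw bytes

    On the Linux boxes, echoing $LESSOPEN shows the distribution doing exactly this behind the scenes.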

    Read the article

  • How can I monitor network traffic in an all Mac home network?

    - by raiglstorfer
    I have an all-Mac network consisting of an Airport Extreme, 1 Mac Pro, 1 Mac Mini, 2 MacBook Pros, 2 iPads, and 2 iPhones. The Mac Pro is connected directly to the Airport Extreme via Cat5, and the rest all runs over wireless. Lately I've been getting prompted by Google to enter CAPTCHAs frequently. The message states that I might have software running on my network that I'm not aware of. My wireless router is password protected using WPA2 Personal and I frequently change my password, so I don't think someone is using the network from outside (but I've no way to confirm this). I'm looking for a relatively cheap (preferably open source) solution that would enable me to monitor and profile the network usage by machine and port. Can someone recommend a solution?
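
    A zero-cost first step (a sketch, not a full monitoring setup) is to capture a sample on the wired Mac Pro with the tcpdump that ships with OS X and see which internal hosts are doing the talking; en0 is an assumption for the wired interface name:

        sudo tcpdump -i en0 -n -c 2000 -w sample.pcap
        # open sample.pcap in Wireshark (free) and use Statistics > Endpoints to rank hosts by traffic

    One caveat: on a switched/wireless network the Mac Pro only sees its own traffic plus broadcasts, so per-machine accounting for the whole LAN really needs the capture to happen at (or be mirrored to) the router; an ntop/ntopng instance fed that way is a common open-source choice.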

    Read the article

  • Run 2008 R2 Service under 2000 Domain Account

    - by NoDisassemble
    I'm trying to get a service to run under a domain account. When I try to add the account, I get the error "The account name is invalid or does not exist, or the password is invalid for the account name specified." I know the account exists and the password is correct. I am also having trouble adding it manually to the "Log on as a service" setting; there I get the error "An extended error has occurred. Failed to save Local Policy Database." After a day of research, I'm starting to suspect it has to do with it being a 2008 R2 server trying to use a 2000 domain account. I've tried changing the LAN Manager authentication level, and the minimum session security looks okay per my Google digging. I'm not sure what else I can do.
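
    One more data point that costs nothing to try (the service and account names below are placeholders, not from the question): setting the logon account from an elevated command prompt with sc.exe sometimes returns a more specific error than the Services snap-in, which helps separate "bad credentials" from "policy could not be updated":

        sc config MyService obj= "OLDDOMAIN\serviceaccount" password= "TheActualPassword"
        rem confirm the new SERVICE_START_NAME afterwards:
        sc qc MyService

    (The space after each = is required by sc.exe's syntax.) If that fails the same way, the LAN Manager / NTLM compatibility settings between the 2008 R2 member server and the 2000 domain controllers are the next thing to revisit.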

    Read the article

  • Chrome Residual Redirect to Login Page

    - by Shadow503
    My college redirects people in the dorms to a login page when they use an ethernet (or wifi) connection. I am now at home, and certain domains keep redirecting to this login page. I've tried running ipconfig /flushdns, and I flushed Chrome's local DNS cache as described in "How to clear/flush the DNS cache in Google Chrome?". Interestingly enough, while http://www.reddit.com redirects to the login page, http://www.reddit.com/r/funny works. Firefox works fine for both URLs. Is there a way to fix this without deleting all of my cookies? Thanks!
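
    A hedged reading of the symptom: captive portals usually answer the plain http://www.reddit.com/ request with an HTTP 301/302 redirect, and Chrome caches redirects in its ordinary page cache (not in DNS and not in cookies), which would explain why flushing DNS didn't help and why a deeper URL never requested on campus still loads. A quick check from the command line:

        curl -sI http://www.reddit.com/ | head -n 5    # if the live response isn't the portal redirect, the stale copy lives in Chrome's cache

    Clearing only "Cached images and files" for the affected period (leaving cookies alone), or loading the URL once with the cache bypassed (Shift+Reload, or in an incognito window), normally evicts the stale redirect.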

    Read the article

  • De-duplicate Firefox bookmarks

    - by Zoredache
    What methods exist to de-duplicate Firefox bookmarks? As I search Google, I find that there previously was a plugin called CheckPlaces, but it no longer seems to exist. Another popular suggestion seems to be AM-DeadLink, which I tried, but it completely trashed my bookmarks. (Fortunately I had a backup first, and yes, I had closed Firefox first as instructed.) I was trying to move all my youtube.com bookmarks into a folder. I tried doing a search and then dragging the bookmarks into the folder. Apparently this creates a copy instead of moving them as I expected. So now I have three of everything, since I tried a couple of times.
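
    As a stop-gap that needs no add-on (just a sketch; it only reports duplicates, it doesn't remove them): export the bookmarks to an HTML file from the Library's Import and Backup menu, then, on a machine with the usual Unix text tools, list the URLs that occur more than once:

        grep -o 'HREF="[^"]*"' bookmarks.html | sort | uniq -cd | sort -rn | head

    Each output line is a count followed by a duplicated URL, which at least pins down what got copied two or three times before cleaning up by hand in the Library window.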

    Read the article

  • bursty streaming video

    - by broiyan
    What is the cause of, and solution for, the bursty streaming media problem? Example: when streaming from YouTube, audio (and video) will pause and start intermittently. When it starts, it is bursty; that is, it plays several seconds of sound in just a fraction of a second. Normal sounds are rendered unrecognizable. Then it may pause and, after a few seconds, resume with another burst. The video seems to burst along with the audio. This was observed on Ubuntu 12.04 with Google Chrome.

    Read the article

  • Why does everybody have the same MAC address as me?

    - by iblue
    I just bought some consumer-grade McCheap PCI-E NICs (and this was a bad idea). They both have the same MAC address. When I Google it, it seems like every card from that company has the same address: 00:50:43:00:45:3e. Shouldn't they be unique? According to lspci it's a Marvell Technology Group Ltd. 88E8053 PCI-E Gigabit Ethernet Controller (rev 20). Is there a way to permanently flash a new address?
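
    Short of reflashing the EEPROM (which needs a vendor-specific tool), the address can be overridden in software at each boot, which is usually enough to keep the two cards from colliding on the LAN. A sketch, assuming the interface is eth1 and picking a locally-administered address (second hex digit 2) so it cannot clash with any real vendor OUI:

        sudo ip link set dev eth1 down
        sudo ip link set dev eth1 address 02:50:43:00:45:3f
        sudo ip link set dev eth1 up
        ip link show eth1    # confirm the new link/ether value

    Making it stick across reboots is then a matter of putting the same override into the distribution's network configuration (for example a udev rule or the interface stanza), since the change above only lives until the next reboot.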

    Read the article

  • Find duplicate images?

    - by stefan.at.wpf
    I need software (for Windows) that finds duplicate images by comparing the actual image content. I have duplicates of images, once with metadata, once without, so the image is the same but the file is not, which means comparing the files byte by byte is not enough. Another requirement is that I can delete all or several duplicates at once; I don't want to click "delete" 100 times! That is what I would actually have to do using XnView. I also checked the other topics here and Google, but if a program compares the images in a perfect way (like XnView), it doesn't allow the deletion of several duplicates at once.
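
    Not a Windows GUI recommendation, but a hedged sketch of the underlying idea: ImageMagick's identify can print a signature computed from the decoded pixel data, so two files that differ only in metadata end up with the same signature. On a machine with ImageMagick and the usual Unix text tools (ImageMagick itself also runs on Windows):

        for f in *.jpg; do printf '%s  %s\n' "$(identify -quiet -format '%#' "$f")" "$f"; done | sort > signatures.txt
        # files with identical pixel data now sit on adjacent lines with the same leading hash

    Everything after the first entry in each group of identical hashes is a candidate for a scripted bulk delete, which avoids clicking "delete" a hundred times.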

    Read the article
