Search Results

Search found 26427 results on 1058 pages for 'google scripts'.


  • Mac OS X Lion 10.7.2 update breaks SSL

    - by mcandre
    Summary: after updating from 10.7.1 to 10.7.2, neither Safari nor Google Chrome can load Gmail; spinning beachballs all around. The problem isn't Gmail, because Firefox loads Gmail just fine. Nor is it limited to Safari and Chrome: any program that uses WebKit (Google Chrome, Safari) or a Cocoa library (Gilgamesh) to access the Internet has trouble loading secure sites. The various forums online suggest a handful of fixes, none of which work.
    Analysis:
    Fix #1: open Keychain Access.app and delete the unknown certificate. The 10.7.2 update also prevents Keychain Access from loading; the Keychain program itself beachballs.
    Fix #2: delete ~/Library/Keychains/login.keychain and /Library/Keychains/System.keychain. This temporarily resolves the issue and lets you load secure sites, but a minute or two after rebooting or hibernating something magically undoes the fix, so you have to delete these files over and over.
    Fix #3: delete ~/Library/Application\ Support/Mob* and /Library/Application\ Support/Mob*. There is a rumor that ubd, the daemon behind the new MobileMe/iCloud service, is causing the issue. This fix does not resolve it.
    Fix #4: open Keychain Access, open the Preferences, and disable OCSP and CRL checking. This fix does not resolve it either.
    Fix #5: use the 10.7.0-to-10.7.2 combo installer rather than the 10.7.1-to-10.7.2 delta installer. When I run the combo installer, it stays forever at the "Validating Packages..." screen; the combo installer itself is badly bugged. I force-quit the installer, ran "sudo killall installd" to force-quit the background installer process, and reran the combo installer. Same problem: it stalls at "Validating Packages...".
    Recap: the only fix that works is deleting the keychains, but you have to do this every time you reboot or wake from hibernation. There is some evidence that ubd continually corrupts the keychain files, but the suggested ubd fix of deleting ~/Library/Application\ Support/Mob* and /Library/Application\ Support/Mob* does not resolve the issue. Evidently, something is corrupting the keychain over and over. Also posted on the Apple Support Communities.
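
    Since the only workaround that holds is deleting the keychain files after each reboot or wake, here is a minimal sketch (mine, not from the original post) that automates it; the paths come from Fix #2 above, and removing the system keychain needs root:

        #!/usr/bin/env python3
        # Hypothetical helper for the workaround in Fix #2: delete the
        # corrupted keychain files. Run with sudo for the system keychain.
        from pathlib import Path

        keychains = [
            Path.home() / "Library/Keychains/login.keychain",
            Path("/Library/Keychains/System.keychain"),  # needs root
        ]

        for kc in keychains:
            if not kc.exists():
                print(f"{kc}: not found (already removed?)")
                continue
            try:
                kc.unlink()
                print(f"removed {kc}")
            except PermissionError:
                print(f"no permission for {kc}; rerun with sudo")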

    Read the article

  • Indexing data from multiple tables with Oracle Text

    - by Roger Ford
    It's well known that Oracle Text indexes perform best when all the data to be indexed is combined into a single index. The query select * from mytable where contains (title, 'dog') > 0 or contains (body, 'cat') > 0 will tend to perform much worse than select * from mytable where contains (text, 'dog WITHIN title OR cat WITHIN body') > 0. For this reason, Oracle Text provides the MULTI_COLUMN_DATASTORE, which combines data from multiple columns into a single index. Effectively, it constructs a "virtual document" at indexing time, which might look something like: <title>the big dog</title> <body>the ginger cat smiles</body> This virtual document can be indexed using either AUTO_SECTION_GROUP, or by explicitly defining sections for title and body, allowing the query as expressed above. Note that we've used a column called "text" - this might be a dummy column added to the table simply to allow us to create an index on it - or we could have created the index on either of the "real" columns, title or body. It should be noted that MULTI_COLUMN_DATASTORE doesn't automatically handle updates to the columns it uses - if you create the index on the column text but specify that columns title and body are to be indexed, you will need to arrange triggers so that the text column is updated whenever title or body is altered. That works fine for single tables. But what if we actually want to combine data from multiple tables? In that case there are two approaches which work well:
    1. Create a real table containing a summary of the information, and create the index on that using the MULTI_COLUMN_DATASTORE. This is simple and effective, but it uses a lot of disk space, as the information to be indexed has to be duplicated.
    2. Create our own "virtual" documents using the USER_DATASTORE. The user datastore allows us to specify a PL/SQL procedure that fetches the data to be indexed, returned in a CLOB (or occasionally in a BLOB or VARCHAR2). This procedure is called once for each row in the table to be indexed and is passed the ROWID of the current row. The actual contents of the procedure are entirely up to the owner, but it is normal to fetch data from one or more columns of database tables.
    In both cases, we still need to take care of updates - making sure that we have all the triggers necessary to update the indexed column (and, in case 1, the summary table) whenever any of the data to be indexed changes. I've written full examples of both techniques as SQL scripts to be run in the SQL*Plus tool. You will need to run them as a user who has the CTXAPP role and the CREATE DIRECTORY privilege. Part of the data to be indexed is a Microsoft Word file called "1.doc". You should create this file in Word, preferably containing the single line of text "test document". The file can be saved anywhere, but the SQL scripts need to be changed so that the "create or replace directory" command refers to the right location; in the example I've used C:\doc. multi_table_indexing_1.sql: creates a summary table containing all the data, and uses multi_column_datastore Download link / View in browser multi_table_indexing_2.sql: creates "virtual" documents using a procedure as a user_datastore Download link / View in browser
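
    To make the single-table case concrete, here is a minimal sketch of the MULTI_COLUMN_DATASTORE setup driven from Python with the python-oracledb driver; the connection details, table and preference names are assumptions, not taken from the article's downloadable scripts:

        # Hypothetical sketch of the MULTI_COLUMN_DATASTORE setup described
        # above; assumes a user with the CTXAPP role and a table
        # mytable(title, body, text) where "text" is the dummy column.
        import oracledb

        conn = oracledb.connect(user="scott", password="tiger",
                                dsn="localhost/XEPDB1")
        cur = conn.cursor()

        # Combine title and body into one "virtual document" per row.
        cur.execute("""
            begin
                ctx_ddl.create_preference('my_mcds', 'MULTI_COLUMN_DATASTORE');
                ctx_ddl.set_attribute('my_mcds', 'COLUMNS', 'title, body');
            end;""")

        # Index the dummy column; auto sections allow WITHIN title/body.
        cur.execute("""
            create index mytable_text_idx on mytable (text)
            indextype is ctxsys.context
            parameters ('datastore my_mcds section group ctxsys.auto_section_group')""")

        # The single-index query the article recommends.
        cur.execute("""
            select * from mytable
            where contains (text, 'dog WITHIN title OR cat WITHIN body') > 0""")
        print(cur.fetchall())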

    Read the article

  • Why do some bad websites rank well?

    - by BradB
    Consider the following scenario: you are pitching SEO/website optimisation to a prospective client. You explain the importance of great copy and content, how acquiring links (ethically) can improve ranking, why the quality of the HTML build matters (H1/H2 tags, W3C validation, etc.) and why keyword research is beneficial. You may drop in a few Google Webmaster Guidelines or Matt Cutts references to back up your claims, and rubbish the "black hat" approach as no longer effective for good measure. Your advice is ethical and, in the eyes of best practices, spot on. Then the client points out some of their long-established competitors on Google, and you see those competitor websites ranking in the top spots (1 to 3) for medium to highly competitive search phrases that your client wants to compete for. These websites totally contradict your ethical approach and violate pretty much every best practice noted above. They even outperform other "white hat" competitors who follow the guidelines. I experienced this today. One of these well-ranking websites had:
    - about six microsites with more or less the same copy and a slightly varied layout;
    - little or no textual content; I would almost call it duplicate content across the sites, but there was so little of it that it could barely qualify as duplicate;
    - all the content in Flash (with a music track that kicked in on each page load; not so much an SEO issue, but it helps paint the picture);
    - keyword stuffing behind the Flash file, with a bunch of black text on a black background in the style of "keyword 1 keyword 2,keyword1,keyword 2,keyword 2 keyword 3" and so on, with the exact keyword-stuffed combination present on every page of the website;
    - a bunch of clearly self-made links from poor-quality forums and directories with little or no PageRank;
    - links exchanged across the microsites.
    How do you explain your way out of this when this hard evidence is sitting in front of you, undermining your great pitch?

    Read the article

  • Folders in SQL Server Data Tools

    - by jamiet
    Recently I have begun a new project in which I am using SQL Server Data Tools (SSDT) and SQL Server Integration Services (SSIS) 2012. Although I used SSDT and SSIS fairly extensively while SQL Server 2012 was in beta, I usually find that you don't learn about the capabilities and quirks of new products until you use them on a real project, hence I am hoping to have a lot of experiences to share on my blog over the coming few weeks. In this first such blog post I want to talk about file and folder organisation in SSDT. The predecessor to SSDT is Visual Studio Database Projects. When one created a new Visual Studio Database Project, a folder structure was provided with "Schema Objects" and "Scripts" in the root and a series of subfolders for each schema. Apparently a few customers were not too happy with the tool arbitrarily creating lots of folders in Solution Explorer, and hence SSDT has gone in completely the opposite direction: now no folders are created and new objects get created in the root; it is at your discretion where they get moved to. After using SSDT for a few weeks I can safely say that I preferred the older way, because I never used Solution Explorer to navigate my schema objects anyway, so it didn't bother me how many folders it created. Having said that, the thought of a single long list of files in Solution Explorer without any folders makes me shudder, so on this project I have been manually creating folders in which to organise files, and I have tried to mimic the old way as much as possible by creating two folders in the root: one for all schema objects and another for pre/post-deployment scripts. This works fine until different developers start to build their own different subfolder structures; if you are OCD-inclined like me this is going to grate on you eventually, and hence you are going to want to move stuff around so that you have consistent folder structures for each schema and (if you have multiple databases) each project. Moreover, new files get created with a filename of the object name plus ".sql", and people often like to have an extra identifier in the filename to indicate the object type. The overall point is this: files and folders in your solution are going to change. Some version control systems (VCSs) don't take kindly to files being moved around or renamed, because they recognise the renamed/moved file simply as a new file, and when they do that you lose the revision history which, to my mind, is one of the key benefits of using a VCS in the first place. On this project we have been using Team Foundation Server (TFS), and while it pains me to say it (as I am no great fan of TFS's version control system) it has proved invaluable when dealing with the SSDT problems outlined above, because it is integrated right into the Visual Studio IDE. Thus the advice from this blog post is: if you are using SSDT, consider using a Visual-Studio-integrated VCS that can easily handle file renames and file moves. I suspect that fans of other VCSs will counter by saying that their VCS weapon of choice can handle renames and file moves quite satisfactorily, and if that's the case... great... let me know about them in the comments. This blog post is not an attempt to make people use one particular VCS, only to make people aware of this issue that might arise when using SSDT. More to come in the coming few weeks! @jamiet

    Read the article

  • Can't access some websites with any browser

    - by Charles Kingsmill
    I'm running Windows 7 64-bit on a new Samsung laptop, accessing the internet via an ethernet cable to my university's ISP. Some sites work fine (e.g. google.com) but I can't access others at all (microsoft.com, topshop.com). I can't connect to those sites in safe mode with networking, and ping and tracert both fail. There's no proxy. Other users can connect successfully to these sites using my cable and socket. I've tried all the following with no success:
    - using various browsers (IE9, FF, Chrome)
    - creating a new user
    - updating drivers
    - clearing the DNS cache
    - using OpenDNS and Google's DNS
    - turning off Avast
    - tweaking the MTU
    - running MS malicious software removal tool
    - running Spybot S&D
    - reviewing the hosts file
    - disabling the IPv6 options
    - repairing / resetting winsock settings
    - disabling advanced javascript options
    I have run out of ideas... can anyone see anything I've missed?
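
    When DNS, ping and tracert all disagree, it helps to separate name resolution from reachability. Here is a quick triage sketch (mine, not from the question) using only the Python standard library; the host list is just the sites named above:

        #!/usr/bin/env python3
        # Separate DNS failures from routing/filtering failures: resolve
        # each name, then attempt a plain TCP connection to port 443.
        import socket

        HOSTS = ["google.com", "microsoft.com", "topshop.com"]

        for host in HOSTS:
            try:
                ip = socket.gethostbyname(host)
            except socket.gaierror as e:
                print(f"{host}: DNS failed ({e})")
                continue
            try:
                with socket.create_connection((ip, 443), timeout=5):
                    print(f"{host}: {ip}, TCP 443 reachable")
            except OSError as e:
                print(f"{host}: {ip}, TCP 443 failed ({e})")

    If names resolve but connections hang, an MTU or filtering problem (both mentioned in the list above) becomes more likely than a DNS one.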

    Read the article

  • Keeping websites from knowing where I live

    - by D Connors
    This question is about practicality and annoyance, not security. I live in Brazil and, apparently, every single website I visit knows it. Usually that's okay, but quite a few sites don't use that information sensibly. For instance: Bing keeps thinking that Brazilian pages are far more relevant to me than American ones (they're not); google.com always redirects me to google.com.br; Microsoft automatically sends me to horribly translated support pages in Portuguese (which would just be easier to read in English). These are just a few examples. Usually it's stuff I can live with (or work around), but some of it is just plain irritating. I have geolocation disabled in Firefox, so I guess they're getting this information either from my IP or from Windows itself (which I bought here). Is there a way to avoid this? Either tell them nothing, or make them think I live somewhere else?

    Read the article

  • Personal DNS server and fallback outside home

    - by Jens
    I have my own DNS server at home to resolve local names, and it works fine. Then there's my laptop: it obviously leaves home now and then, and on other networks my home DNS server is not reachable. So I figured I would just add Google as a secondary DNS. But when I do that, I suddenly can't access my local stuff at home any more, as if the laptop gets a quicker response from Google's DNS, which of course can't find the names I use locally. If I then remove the secondary DNS and keep only my own, it works fine again. So do I somehow need to separate which DNS servers to use on which networks? I already use separate DNS settings when I connect via my 3G modem, but hotspots seem to use the same settings regardless (at least on the train). Can it also differ for wired connections? Is there another solution?
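
    The catch is that a "secondary" DNS entry is not a strict fallback: the OS resolver is free to ask either server, so local names break whenever Google's server answers first. Per-query fallback has to be done explicitly, as in this sketch using the third-party dnspython package (the server IPs and hostname are placeholders, not from the question):

        #!/usr/bin/env python3
        # Try the home DNS server first, fall back to Google Public DNS.
        # Requires the dnspython package (pip install dnspython).
        import dns.resolver
        import dns.exception

        HOME_DNS, PUBLIC_DNS = "192.168.1.1", "8.8.8.8"

        def lookup(name: str) -> str:
            for server in (HOME_DNS, PUBLIC_DNS):
                resolver = dns.resolver.Resolver(configure=False)
                resolver.nameservers = [server]
                resolver.lifetime = 1.0  # give up fast, try next server
                try:
                    answer = resolver.resolve(name, "A")
                    return f"{name} -> {answer[0]} (via {server})"
                except (dns.exception.Timeout, dns.resolver.NXDOMAIN,
                        dns.resolver.NoAnswer, dns.resolver.NoNameservers):
                    continue
            return f"{name}: no answer from either server"

        print(lookup("nas.home.example"))

    In practice the usual fix is a local forwarder (e.g. dnsmasq) that routes queries for your home domain to the home server and everything else upstream, or per-network profiles in the network manager.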

    Read the article

  • How do search engines handle hyphenated words?

    - by NinjaKC
    I am not sure my title fully explains what I mean, but I thought this might be an interesting question. If I had a set of keywords broken up with a dash or two, would search engines consider the dash-split keyword as possibly one full keyword? Say I have a site that breaks words down the way dictionary sites do. A keyword for a page might end up in the page, and/or in the URL, broken by dashes: Key-word = keyword; Co-op-er-at-ive = cooperative; Pho-to-gra-phy = photography; www.example.com/key-word/; www.example.com/co-op-er-at-ive/; www.example.com/pho-to-gra-phy/. I know search engines (at least Google) treat a dash as a space and read the result as multiple words. But in English a dash can also break a single word into syllables (at least I think it can, can't it?), so will search engines take that into consideration too? I did a little research: I Googled some words with random dashes inserted, and it returned the words I had searched for, but Google may simply have treated that as a typo on the user's end. So really I'm wondering whether I can deliberately put a dash in a keyword and have the search engine spiders still index that keyword as the real word without dashes. I've done a little Googling and looking here on Stackoverflow, but everything comes down to dashes separating multiple words, not the specific thing I'm trying to figure out. Hopefully that makes sense. I am not an SEO expert yet, but I get the basics and have been playing, and this is really just a random question to satisfy my curiosity :P
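
    Purely as an illustration of the two readings (this is a toy, not how any real engine tokenizes), the difference boils down to whether the indexer treats a dash as a word separator or strips it:

        #!/usr/bin/env python3
        # Two plausible tokenizations of a dashed slug.
        import re

        slug = "pho-to-gra-phy"

        # Dash as separator: the behaviour described above for Google.
        separate_words = re.split(r"-+", slug)   # ['pho', 'to', 'gra', 'phy']

        # Dash stripped: the reading the question is hoping for.
        single_word = slug.replace("-", "")      # 'photography'

        print(separate_words, single_word)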

    Read the article

  • Outlook signature problem

    - by user20989
    I have created a signature for my mail; the signature HTML file has the following code:
    <table border="0" cellspacing="0" cellpadding="0"> <tr> <td>Regards</td> </tr> <tr> <td><img src="http://images.google.com/intl/en_ALL/images/logos/images_logo_lg.gif" alt="Signature Picture" /></td> </tr> </table>
    It works fine when I first send a message, but if I forward the sent item, I get the signature without the image displaying. When I checked the source of the email, I found the signature image tag had been changed automatically to something like:
    <IMG height=128 alt="Compnay Logo" src="mhtml:{4B829C94-37FC-44B9-A60C-CC4BB1E0AE9B}mid://00000152/!http://images.google.com/intl/en_ALL/images/logos/images_logo_lg.gif" width=206 border=0>
    How do I fix this, or is there another way to put an image in a signature? (Note: the image is also hosted on a web server.) Thanks
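
    One common way around this kind of URL rewriting is to stop referencing the remote image at all and instead attach the logo to the message itself, referenced by Content-ID. A minimal sketch with Python's standard email library (the addresses and logo path are placeholders):

        #!/usr/bin/env python3
        # Build a message whose HTML part references an attached image by
        # Content-ID, so no client needs to rewrite a remote URL.
        from email.message import EmailMessage
        from email.utils import make_msgid

        msg = EmailMessage()
        msg["Subject"] = "Signature test"
        msg["From"] = "sender@example.com"
        msg["To"] = "recipient@example.com"

        logo_cid = make_msgid()          # '<unique-id@host>'
        msg.set_content("Regards")       # plain-text fallback
        msg.add_alternative(
            '<table><tr><td>Regards</td></tr>'
            f'<tr><td><img src="cid:{logo_cid[1:-1]}" '
            'alt="Signature Picture"></td></tr></table>',
            subtype="html",
        )

        with open("logo.gif", "rb") as f:
            # Attach the image to the HTML alternative under that cid.
            msg.get_payload()[1].add_related(f.read(), "image", "gif",
                                             cid=logo_cid)

        # Hand msg to smtplib.SMTP(...).send_message(msg) to send it.
        print(msg["Subject"], "built with inline logo")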

    Read the article

  • Getting a lot of postmaster undeliverable notices for non-existent users

    - by Mike Walsh
    I've had my domain (straightpathsql.com) for a few years now, and I host my e-mail with Google Apps for Business, as I have for a while. All of a sudden, in the past week, I have started to get a lot of postmaster delivery-failure notices from various domains, most of them involving bogus e-mail addresses at my domain ([email protected], for example). My assumption here is that someone is trying to relay mail through some other host using addresses at my domain (not through my hosts, which are secured through Google Apps for Business, I presume) and there isn't much I can do to stop it. But I just want to make sure there isn't something else I need to be looking at here. (Quick edit: the reason I get these messages is that I set myself up as a catch-all, so whatever address at my domain an e-mail is sent to, I'll get it if the account isn't set up. All of the failure messages are sent to bogus addresses on my domain.) An example delivery-failure notice is below; I know nothing of the addresses in it and they look like garbage:
    The following message to <[email protected]> was undeliverable. The reason for the problem: 5.1.0 - Unknown address error 553-'sorry, this recipient is in my badrecipientto list (#5.7.1)'
    Final-Recipient: rfc822;[email protected]
    Action: failed
    Status: 5.0.0 (permanent failure)
    Remote-MTA: dns; [118.82.83.11]
    Diagnostic-Code: smtp; 5.1.0 - Unknown address error 553-'sorry, this recipient is in my badrecipientto list (#5.7.1)' (delivery attempts: 0)
    ---------- Forwarded message ----------
    From: Howard Blankenship <[email protected]>
    To: omiivi2922 <[email protected]>
    Cc:
    Date:
    Subject: Hi omiivi2922

    Read the article

  • Why does my mail get marked as spam?

    - by schoen
    I have the server "afspraakmanager.be". It meets every criterion for not being a spam server (and it isn't, by the way): it has reverse DNS, SPF, DKIM, and so on. But Hotmail marks its mail as spam. I think the problem is the SPF/DKIM records. When I send an email to my Gmail account, the headers say:
    "Received-SPF: neutral (google.com: 2a02:348:8e:6048::1 is neither permitted nor denied by best guess record for domain of [email protected]) client-ip=2a02:348:8e:6048::1; Authentication-Results: mx.google.com; spf=neutral (google.com: 2a02:348:8e:6048::1 is neither permitted nor denied by best guess record for domain of [email protected]) [email protected]; dkim=neutral (bad format) [email protected]"
    So I guess my SPF and DKIM records aren't set up right, but I don't have a clue what is wrong with them. This is the zone file:
    ; zone file for afspraakmanager.be
    $ORIGIN afspraakmanager.be.
    $TTL 3600
    @ 86400 IN SOA ns1.eurodns.com. hostmaster.eurodns.com. (
        2013102003 ; serial
        86400 ; refresh
        7200 ; retry
        604800 ; expire
        86400 ; minimum
    )
    @ 86400 IN NS ns1.eurodns.com.
    @ 86400 IN NS ns2.eurodns.com.
    @ 86400 IN NS ns3.eurodns.com.
    @ 86400 IN NS ns4.eurodns.com.
    ; Mail Exchanger definition
    @ 600 IN MX 10 smtp
    ; IPv4 Address definition
    @ IN A 37.230.96.72
    afspraakmanager.be 600 IN A 37.230.96.72
    localhost 86400 IN A 127.0.0.1
    smtp 600 IN A 37.230.96.72
    www 600 IN A 37.230.96.72
    ; Text definition
    default._domainkey 600 IN TXT "v=DKIM1\\; k=rsa\\; p=MIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQKBgQC6pvlZKnbSVXg1Bf3MF2l8xRrKPmqIw2i9Rn1yZ3HEny9qH1vyGXUjdv2O0aQbd5YShSGjtg5H/GedRMLpB0Qb+hBj1yGofOQTdcVtZZfj8qBY5Z7vEkhvtdaogQ0vLjgcwhg0BBuTewEkLxrl9IIzkPMZ1SCtM2Y0RtiUhg2cjQIDAQAB"
    ; Sender Policy Framework definition
    afspraakmanager.be 600 IN SPF "v=spf1 a mx ptr +all"
    The DKIM signature in the header:
    DKIM-Signature: v=1; a=rsa-sha256; c=simple/simple; d=afspraakmanager.be; s=mail; t=1382361029; bh=4pDpXBY8rCbX8+MfrklZzpQxaUsa3vSPUYjcDR3KAnU=; h=Date:From:To:Subject:From; b=SoBBaAlrueD8qID8txl2SBSqnZgN2lkPCdSPI/m7/YLezIcBedkgIX1NswYiZFl6Z AmF8dES73WUaaJjItVHSrdCJK2mJ/Az+vrgNsyk+GqZZ1YPiIlH3gqRrsguhoofXUX /gqLlqsLxqxkKKd9EbSzKRHuDGlJCLm5SlL8wnL0=
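
    A few mismatches stand out in the data above (my reading, worth verifying): the policy is published with the deprecated SPF record type rather than TXT, and receivers such as Gmail only read TXT, which is why the headers mention a "best guess record"; the zone has no AAAA record, so the a/mx mechanisms can never match the IPv6 address (2a02:348:8e:6048::1) the mail actually leaves from; and the signature uses selector s=mail while the zone only defines default._domainkey, so verifiers query a name that doesn't exist. Also, +all tells receivers that every host is permitted, which makes the policy useless for filtering; ~all or -all is the usual ending. A small check sketch using the third-party dnspython package (an assumption; any dig would do equally well):

        #!/usr/bin/env python3
        # Check what verifiers actually see: TXT records only, and the
        # DKIM selector taken from the signature's s= tag ("mail").
        # Requires the dnspython package.
        import dns.resolver

        def show_txt(name: str) -> None:
            try:
                for rr in dns.resolver.resolve(name, "TXT"):
                    print(f"{name}: {rr}")
            except Exception as e:  # NXDOMAIN, NoAnswer, timeout, ...
                print(f"{name}: lookup failed ({e})")

        show_txt("afspraakmanager.be")                     # SPF as TXT?
        show_txt("mail._domainkey.afspraakmanager.be")     # selector s=mail
        show_txt("default._domainkey.afspraakmanager.be")  # the one in the zone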

    Read the article

  • Read On Phone Pushes Data from Your Desktop to the Appropriate Android App

    - by ETC
    Read On Phone is a free Android application that intelligently pushes data to your phone from your browser. Rather than simply opening the URL on your phone, it opens the appropriate application for the task and formats text. Most send-to-phone tools just take the URL of the web page you're looking at on your computer and shuttle it to your phone. Read On Phone is a more active and effective tool: when you send a page of text, it formats the text for easy reading on your phone, and when you send a YouTube video, map, or telephone number, it opens the appropriate tool on your phone, such as your YouTube viewer, Google Maps, or your phone dialer. In addition to that handy functionality, Read On Phone also includes adjustments for day and night reading, font size, auto-scrolling, and pagination. Read On Phone is available as both a Chrome extension and as a bookmarklet for cross-browser use. Hit up the link below for additional information. Read On Phone

    Read the article

  • Should GeeksWithBlogs move to the Wordpress Platform?

    - by Martin Hinshelwood
    Geekswithblogs was my first ever blog; my first post was on 22nd June 2006. Since then very little functionality has been added. This is not a complaint, but rather an observation that it is very hard to keep up with all of the blogging capabilities that people want. My point would be: "Why bother!" Vote now for a migration of the awesome Geekswithblogs content from SubText to Wordpress. Having been a long-time user of GWB, I have lately been worried by my envy of other blogging platforms. I made a number of requests around 10 months ago for things that almost all blogging platforms provide but which are not available on GWB:
    - Support for other comment frameworks. The current comment system is so antiquated that it does not even have common spam-prevention options like Facebook, Twitter or Google login.
    - Tags in the RSS feed. Their absence can keep you from getting the Google juice you deserve.
    - 301 redirects to a single URL. If you use a custom URL, all your posts are split between the GWB URL and the custom one, which is very bad for SEO.
    I realise that it is difficult to find time to add all of the features that all the uber geeks on this site want, so why bother... let's move to the most popular and most modded platform available and allow everyone to add whatever widgets they like. Figure: Vote for Geekswithblogs moving to Wordpress. Why would I want this? There are over 13k plugins available, an easy augmentation model, full mobile support, regular releases, and lots more. This could be a turning point in the legendary history of Geeks With Blogs; be part of it. What can I do to make this happen? I need your help to make this happen: vote for it, and discuss it.

    Read the article

  • Running own DNS Server for learning purpose

    - by sundar22in
    I would like to run my own DNS server on my laptop for learning purposes. I recently used Google Public DNS and liked it, and I want to build something similar and small for my own web browsing. What I vaguely dream of is using my own DNS server as the primary and Google Public DNS as the secondary. I would like to build up my DNS server gradually by editing the configuration files (if it can be automated, great, but I have no clue there). Sometimes it sounds like a stupid idea to me, but I am fine with editing a config file for each site I want to add to my DNS server. Any pointers or suggestions are welcome.
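
    For a feel of what a DNS server actually does, before diving into BIND or dnsmasq configs, here is a toy forwarder sketch (my own illustration, standard library only) that relays raw queries to Google Public DNS; point a client at 127.0.0.1:5353 and watch the queries flow:

        #!/usr/bin/env python3
        # Minimal UDP DNS forwarder: no parsing, just relay the packets.
        import socket

        UPSTREAM = ("8.8.8.8", 53)
        LISTEN = ("127.0.0.1", 5353)   # port 53 would require root

        server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        server.bind(LISTEN)
        print(f"forwarding {LISTEN} -> {UPSTREAM}")

        while True:
            query, client = server.recvfrom(512)   # classic DNS/UDP size
            with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as up:
                up.settimeout(2.0)
                up.sendto(query, UPSTREAM)
                try:
                    reply, _ = up.recvfrom(4096)
                    server.sendto(reply, client)    # relay answer unchanged
                except socket.timeout:
                    pass                            # client will retry

    A real learning setup would add local zones (the hand-edited config described above) and caching, which is exactly what dnsmasq, unbound or BIND provide.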

    Read the article

  • Transaction classification with artificial intelligence

    - by Alex
    For a project, I have to classify a list of banking transactions based on their descriptions. Suppose I have two categories: health and entertainment. Initially, the transactions carry basic information: date and time, amount, and a description given by the user. For example: Transaction 1: 09/17/2012 12:23:02 pm - 45.32$ - "medicine payments"; Transaction 2: 09/18/2012 1:56:54 pm - 8.99$ - "movie ticket"; Transaction 3: 09/18/2012 7:46:37 pm - 299.45$ - "dentist appointment"; Transaction 4: 09/19/2012 6:50:17 am - 45.32$ - "videogame shopping". The idea is to use the description to classify each transaction: 1 and 3 would go to the "health" category, while 2 and 4 would go to "entertainment". I want to use the Google Prediction API to do this. In reality, I have 7 different categories and, for each one, a lot of keywords related to that category; I would use some for training and some for testing. Is this even possible? I mean, determining the category given a few words? Also, the number of words is not necessarily the same in every transaction. Thanks for any help or guidance! Much appreciated. Possible solution: https://developers.google.com/prediction/docs/hello_world?hl=es#theproblem
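
    Yes, this is a standard text-classification task, and variable-length descriptions are fine because the text is first turned into word counts. As a sketch of the underlying idea, here it is with scikit-learn rather than the Google Prediction API the question targets (the training rows are made up):

        #!/usr/bin/env python3
        # Bag-of-words + Naive Bayes over transaction descriptions.
        # Requires scikit-learn (pip install scikit-learn).
        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.naive_bayes import MultinomialNB
        from sklearn.pipeline import make_pipeline

        train_texts = [
            "medicine payments", "dentist appointment", "pharmacy refill",
            "movie ticket", "videogame shopping", "concert tickets",
        ]
        train_labels = ["health", "health", "health",
                        "entertainment", "entertainment", "entertainment"]

        model = make_pipeline(CountVectorizer(), MultinomialNB())
        model.fit(train_texts, train_labels)

        print(model.predict(["dentist visit", "movie night"]))
        # -> ['health' 'entertainment']; with only a handful of rows per
        # category the guesses are rough, and accuracy improves quickly
        # as labelled examples are added.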

    Read the article

  • Prevent mail being flagged as spam when switching mail servers (new SPF records)?

    - by Jakobud
    For our business, we send a significant number of newsletter alerts to customers who sign up for them on our website. We used to send this mail directly from our web server via PHP, but because the web server limited the number of emails we could send per day, we purchased a VM server at a different host (one that doesn't throttle email) and we are going to use that account solely for sending out the emails. Now that the SPF records and the source mail server are going to be different from what they used to be, what steps need to be taken to prevent these emails from being flagged as spam? I know Gmail is pretty smart about determining whether the sending server is the one it expects for the domain (for flagging phishing emails, etc.), and we don't want that to happen to our emails. Just sending a couple of test emails out, Gmail shows the SPF check saying:
    Authentication-Results: mx.google.com; spf=neutral (google.com: XXX.XXX.23.176 is neither permitted nor denied by best guess record for domain of [email protected]) [email protected]
    So is there anything we need to do with the SPF records as we move forward?

    Read the article

  • Yahoo search: different results shown in two identical searches

    - by Marco Demaio
    Hello, simple question: searching on http://www.yahoo.it for "villa matrimonio bologna", I noticed Yahoo shows different results for the same search. You may need to retry a few times to reproduce this, perhaps exiting and reopening the browser, or searching once, clearing browser cookies, and searching again (it's even easier to see if you use two different browsers at the same time to search for the same phrase). To make this easy to reproduce, here are the queries shown in the address bar after the search, so you can just click these to see the results they return:
    http://it.search.yahoo.com/search;_ylt=AirvLYKvBMPP_6MpAmONN14brK5_?vc=&p=villa+matrimonio+bologna&toggle=1&cop=mss&ei=UTF-8&fr=yfp-t-709
    http://it.search.yahoo.com/search;_ylt=AirvLYKvBMPP_6MpAmONN14brK5_?vc=&p=villa+matrimonio+bologna&toggle=1&cop=mss&ei=UTF-8&fr=sfp
    Note that the last parameter, fr, is different, but Yahoo set it, not me; I don't even know what it means. You can see in the search box that the searched phrase is identical in both cases. So why is Yahoo giving different results for the same search phrase? I used the same browser and performed the test within a few minutes by simply trying more than once. You may also notice that the number of results returned (written on the left side of the page) differs: the first search returns 274K results, the second 5.38M. You might think this is just a Yahoo glitch, but for almost a year, while checking now and then how certain websites rank on Yahoo and Google, I have noticed that two searches on the same phrase can show different results even on the same day, a few minutes or hours apart. I couldn't reproduce this behaviour on Google, so I can't say for sure, but since it seems to happen sometimes, I wonder if anyone else has noticed it too. Is this normal behaviour for search engines? Because if it's normal (and I've only just noticed it), how do you judge how well a site ranks on a search engine? You could even see a customer's website ranking differently from what the customer sees on his own PC.

    Read the article

  • What is currently a good stack for a simple JavaScript 2D multiplatform game?

    - by JacobusR
    I'm looking for advice from someone who can help me avoid common pitfalls in developing lightweight, quick-to-market games. Yes, I've heard of Google ;-), but a trip down Google lane does not beat solid experience from someone who has been down this path. I'm looking for advice from someone who works alone, or in a small team, and has developed some 2D games for mobile. My game ideas don't require intensive graphics, just simple arcade-style glyphs and collision detection. My experience is mostly with Scala, Java and web technologies (Javascript, CSS, SVG, HTML, etc.). My question is: is there a stack someone can suggest that will be a good fit for my skill set? I'm considering Javascript for simple 2D shooter games, with simple multiplayer support provided by a server-side written in Scala on Spray. Is this silly? Should I rather look into something like Unity 3D and use it in 2D mode? For the actual game engine, something like the Sparrow Framework would be great, but it needs to be multiplatform.

    Read the article

  • Rendering composite/SVideo input with GraphEdit (comp./svid. source for crossbar)?

    - by Synetech
    Does anyone know how to build a GraphEdit graph that renders composite/S-Video input (especially for a Hauppauge or AIW card)? Google (and Google Images) finds only results focused on rendering the TV tuner, which is already simple enough. The TV-tuner-in pins connect to the TvTuner/TvAudio source, but nothing seems able to connect to the Composite/SVideo pins. I have looked through the filters and could find no sources to connect to the Composite/SVideo input pins of the crossbar; GraphEdit always complains that they are not compatible.

    Read the article

  • Are there video cameras which geotag individual frames?

    - by Grzegorz Adam Hankiewicz
    I'm looking for a way to record live video with the specific requirement that each frame be georeferenced with GPS. Right now I'm using a normal video camera plus a PDA+GPS that records the position, but it's difficult to sync the two, and sometimes I've forgotten to turn the PDA+GPS on, or it has failed for some reason, and all my video has been useless. Using Google I found that about two years ago a company named Seero produced such video cameras and software, but apparently the domain doesn't exist any more and I only find references on other pages mentioning it. Does somebody know of another product? I need to record the video in HD and have some way to export the frame positions to Google Maps or other GIS software, so that I can click on the map and see what was being recorded in the video at that point. One position per second is precise enough for the GPS track; intermediate frames of the video stream can be interpolated.

    Read the article

  • What is the best way to restrict access to adult content on Ubuntu?

    - by Stephen Myall
    I bought my kids a PC and installed 12.04 (Unity) on it. The bottom line is, I want my children to use the computer unsupervised while I have confidence they cannot access anything inappropriate. What I have looked at: Scrubit, a tool that lets me configure my wifi router to block content; this solution would also protect my other PC and mobile devices, though it may be overkill since I just want the solution to work on one PC. I also did some Google searches and came across the application called Nanny (it seems to look the part). My experience of OSS is that the best solutions frequently don't appear first in a Google search, and in this case I need to trust the method, so my question is very specific. I want to leverage your knowledge and experience to understand what the best way is to restrict adult content on 12.04 LTS, as this is important to me. It may be a combination of things, so please don't just answer "try this or that" and point me at some PPA unless you can share your experience of how good it is and, of course, any constraints. Thanks in advance.

    Read the article

  • Kubuntu 12.04 - DNS Issues

    - by AndrewJesaitis
    Starting yesterday (6/11/12), I've been having many network problems. When requesting a page in Chrome, the page hangs on "Sending request" and then eventually times out. I'm within a VPN that has its own DNS server. I've tried to set my DNS manually, both through the Network Manager applet and by editing /etc/network/interfaces. Having no luck, I unlinked the resolv.conf file and dumped the contents of my old resolv.conf into it. Still no luck, so I deactivated the dnsmasq server in /etc/NetworkManager/NetworkManager.conf by commenting out dns=dnsmasq:
    $ cat NetworkManager.conf
    [main]
    plugins=ifupdown,keyfile
    #dns=dnsmasq
    no-auto-default=D0:67:E5:EA:B6:6B,
    [ifupdown]
    managed=false
    $ nm-tool
    NetworkManager Tool
    State: connected (global)
    - Device: eth0 [Wired connection 1] -------------------------------------------
      Type: Wired
      Driver: tg3
      State: connected
      Default: yes
      HW Address: D0:67:E5:EA:B6:6B
      Capabilities:
        Carrier Detect: yes
        Speed: 1000 Mb/s
      Wired Properties
        Carrier: on
      IPv4 Settings:
        Address: 192.168.254.122
        Prefix: 24 (255.255.255.0)
        Gateway: 192.168.254.2
        DNS: 192.168.254.1
    What is strange is that the network will work fine for a few minutes and then start to time out; a few minutes later it will work again. I'm unable to reach internal or external sites while it is timing out. When I dig local sites, I receive no answer; I do receive an answer for google.com. At this point I would usually blame the DNS server, especially since things work when I switch to Google's DNS server, but I need to use our internal DNS to reach our internal sites. Nobody else is having issues, and they are all using DHCP; that group includes one user who is on 11.04. At this point I'm at a loss for what to do, so any help would be appreciated.

    Read the article

  • Share on: FB, Tweet, Digg, LinkedIn, Delicious, My mother, ... is it just fashion, or is there real value?

    - by Marco Demaio
    Nowadays your site is not in fashion if you don't show at least a couple of share buttons like these. Is this just fashion, or do people actually get something good out of it? By "something good" I mostly mean something you could measure, not just a feeling. Maybe I can explain better with an example: did you notice (in some way) that many people clicked those links to share your pages on web 2.0 social sites? And in that case, on which social networks did they mostly share your pages? By the way, I'm not talking about Google PR: I know all web 2.0 social sites use nofollow everywhere, and even hidden links, so by themselves they are useless for PR. UPDATE: according to this video, Google's Matt Cutts says they now use data from social sites in ranking in some way. If that is true, it's obvious that the share buttons for FB, Twitter, etc. have some value. But again, my question is more about what you have noticed in your own experience to be a direct benefit of adding those "share on" links to your website. That is, did you see more traffic coming in from FB, or users who bought your products because of FB or Twitter? Or any other benefits? Thanks

    Read the article

  • The ping response time doesn't reflect the real network response time

    - by yangchenyun
    I encountered a weird problem: the response time returned by ping is almost fixed at 98 ms. Whether I ping the gateway, a local host, or an internet host, the response time is always around 98 ms, even though the actual delay is obviously different. However, the reverse ping (from a local machine to this host) works properly. The following is my route table and the results:
    route -n
    Kernel IP routing table
    Destination     Gateway         Genmask         Flags Metric Ref Use Iface
    0.0.0.0         192.168.1.1     0.0.0.0         UG    100    0   0   eth1
    60.194.136.0    0.0.0.0         255.255.255.0   U     0      0   0   eth0
    169.254.0.0     0.0.0.0         255.255.0.0     U     1000   0   0   eth1
    192.168.1.0     0.0.0.0         255.255.255.0   U     0      0   0   eth1
    # ping the gateway
    ping 192.168.1.1
    PING 192.168.1.1 (192.168.1.1) 56(84) bytes of data.
    64 bytes from 192.168.1.1: icmp_req=1 ttl=64 time=98.7 ms
    64 bytes from 192.168.1.1: icmp_req=2 ttl=64 time=97.0 ms
    64 bytes from 192.168.1.1: icmp_req=3 ttl=64 time=96.0 ms
    64 bytes from 192.168.1.1: icmp_req=4 ttl=64 time=94.9 ms
    64 bytes from 192.168.1.1: icmp_req=5 ttl=64 time=94.0 ms
    ^C
    --- 192.168.1.1 ping statistics ---
    5 packets transmitted, 5 received, 0% packet loss, time 4004ms
    rtt min/avg/max/mdev = 94.030/96.149/98.744/1.673 ms
    # ping a local machine
    ping 192.168.1.88
    PING 192.168.1.88 (192.168.1.88) 56(84) bytes of data.
    64 bytes from 192.168.1.88: icmp_req=1 ttl=64 time=98.7 ms
    64 bytes from 192.168.1.88: icmp_req=2 ttl=64 time=96.9 ms
    64 bytes from 192.168.1.88: icmp_req=3 ttl=64 time=96.0 ms
    64 bytes from 192.168.1.88: icmp_req=4 ttl=64 time=95.0 ms
    ^C
    --- 192.168.1.88 ping statistics ---
    4 packets transmitted, 4 received, 0% packet loss, time 3003ms
    rtt min/avg/max/mdev = 95.003/96.696/98.786/1.428 ms
    # ping an internet host
    ping google.com
    PING google.com (74.125.128.139) 56(84) bytes of data.
    64 bytes from hg-in-f139.1e100.net (74.125.128.139): icmp_req=1 ttl=42 time=99.8 ms
    64 bytes from hg-in-f139.1e100.net (74.125.128.139): icmp_req=2 ttl=42 time=99.9 ms
    64 bytes from hg-in-f139.1e100.net (74.125.128.139): icmp_req=3 ttl=42 time=99.9 ms
    64 bytes from hg-in-f139.1e100.net (74.125.128.139): icmp_req=4 ttl=42 time=99.9 ms
    ^C64 bytes from hg-in-f139.1e100.net (74.125.128.139): icmp_req=5 ttl=42 time=99.9 ms
    --- google.com ping statistics ---
    5 packets transmitted, 5 received, 0% packet loss, time 32799ms
    rtt min/avg/max/mdev = 99.862/99.925/99.944/0.284 ms
    I am also running iperf to test the bandwidth; the rate is quite low for a LAN connection:
    iperf -c 192.168.1.87 -t 50 -i 10 -f M
    ------------------------------------------------------------
    Client connecting to 192.168.1.87, TCP port 5001
    TCP window size: 0.06 MByte (default)
    ------------------------------------------------------------
    [  4] local 192.168.1.139 port 54697 connected with 192.168.1.87 port 5001
    [ ID] Interval       Transfer     Bandwidth
    [  4]  0.0-10.0 sec  6.12 MBytes  0.61 MBytes/sec
    [  4] 10.0-20.0 sec  6.38 MBytes  0.64 MBytes/sec
    [  4] 20.0-30.0 sec  6.38 MBytes  0.64 MBytes/sec
    [  4] 30.0-40.0 sec  6.25 MBytes  0.62 MBytes/sec
    [  4] 40.0-50.0 sec  6.38 MBytes  0.64 MBytes/sec
    [  4]  0.0-50.1 sec  31.6 MBytes  0.63 MBytes/sec
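
    One way to cross-check whether those 98 ms are real (a sketch of mine, not from the question; port 80 on the gateway is a guess) is to time a TCP handshake with a monotonic clock, independently of ping's ICMP path:

        #!/usr/bin/env python3
        # Time a TCP connect to the gateway; compare with ping's claim.
        import socket
        import time

        TARGET = ("192.168.1.1", 80)   # gateway from the route table

        t0 = time.perf_counter()
        try:
            with socket.create_connection(TARGET, timeout=3):
                elapsed_ms = (time.perf_counter() - t0) * 1000
            print(f"TCP connect to {TARGET}: {elapsed_ms:.1f} ms")
        except OSError as e:
            print(f"connect failed: {e}")

    If this reports around 1 ms to the gateway while ping insists on ~98 ms, the timestamping on the ICMP path (rather than the network itself) becomes the prime suspect.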

    Read the article

  • Can't access any functions after chown command.

    - by explorex
    I am unable to access many functions on my desktop; I don't have an OS besides Ubuntu and I am new to it. I rebooted my computer thinking Google Chrome had crashed: I opened Chrome and it showed its opening message but never actually opened, so I restarted. While the system was loading I was playing with the keyboard (I don't know what I typed), and when Ubuntu loaded I was unable to access anything. Some of the symptoms are listed below:
    - I cannot hear any sound.
    - I cannot access the wired ethernet connection from the indicator in the right corner where I usually enable internet access, so I have no internet.
    - The local Apache server won't start either: with sudo I get a "setuid must be root" message, and without sudo I get "cannot open socket" or something like it. Typing sudo for anything gives the same "setuid must be root" message.
    - I cannot access external storage devices like pen drives and portable hard drives, and cannot mount my other FAT32 drives.
    I also remember running the command chown -R www-data / earlier and getting an error message. I cannot shut down my computer; it only logs off.

    Read the article
