Search Results

Search found 11896 results on 476 pages for 'smart pro'.


  • Gracefully terminate a request based service on server

    - by Jatin
    In our web application, each HTTP request triggers a lot of computation on the back end, and producing the output can take anywhere from 10 seconds to an hour. While the result is being computed, "Waiting..." is shown to the respective user. It often happens, though, that a user abandons the request partway through. What can be done on the back end so that the computation can be stopped midway to save resources? What different tactics can be applied here? Ideally, rather than killing the thread directly, a graceful termination policy would work wonders.
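
    One pattern that fits this (a minimal sketch, not the poster's actual setup): do the work in chunks and, after each chunk, check whether the client has gone away, then clean up and stop instead of killing anything. In PHP, for example, connection_aborted() can detect the disconnect; compute_chunk() below is a hypothetical stand-in for one slice of the real work.

        <?php
        // Sketch only: chunked computation with cooperative, graceful cancellation.
        ignore_user_abort(true);            // keep control long enough to clean up ourselves

        function compute_chunk($i) {        // hypothetical stand-in for one unit of real work
            usleep(200000);
        }

        for ($i = 0; $i < 100; $i++) {
            compute_chunk($i);

            echo ' ';                       // trickle output so PHP notices a closed connection
            flush();

            if (connection_aborted()) {     // the user cancelled or closed the page
                // release DB handles, temp files, locks here, then stop gracefully
                exit;
            }
        }
        echo 'done';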

    Read the article

  • How can I screen clients that try to register multiple times?

    - by Aba Dov
    My company offers a bonus to every client who registers. We would like to prevent people from abusing this by registering several times. We have thought about filtering clients by IP (but workplaces where all stations share the same IP are a problem) or by cookies (but if cookies are disabled we might lose a client). I would like your opinions on these two methods and would be glad to hear about new ones. Thanks.

    Read the article

  • How do I prevent tampering with AJAX process page? [closed]

    - by whamsicore
    I am using Ajax for processing with jQuery. The data string is sent to my process.php page, where it is saved. The issue: right now anyone can directly type example.com/process.php to access my process page, or type example.com/process.php?var1=foo1&var2=foo2 to emulate a form submission. How do I prevent this from happening? Also, in the Ajax code I specified POST. What is the difference here between POST and GET?
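
    One common pattern (a hedged sketch, not the poster's code): have process.php refuse anything that is not a POST, and require a per-session token that only your own page embeds in the AJAX call, so a hand-typed URL or forged query string is rejected. The name csrf_token is illustrative.

        <?php
        // process.php -- sketch of request-method and per-session token checks (illustrative only)
        session_start();

        if ($_SERVER['REQUEST_METHOD'] !== 'POST') {
            http_response_code(405);        // typed-in URLs arrive as GET and are refused
            exit('POST required');
        }

        $sent  = isset($_POST['csrf_token'])    ? $_POST['csrf_token']    : '';
        $known = isset($_SESSION['csrf_token']) ? $_SESSION['csrf_token'] : '';

        if ($known === '' || !hash_equals($known, $sent)) {
            http_response_code(403);        // request did not originate from our own page
            exit('Invalid token');
        }

        // ...safe to validate and save the submitted data here...

    The page that issues the $.post() call would first store a random value in $_SESSION['csrf_token'] (e.g. bin2hex(random_bytes(16))) and send the same value as the csrf_token field. As for POST vs GET: POST carries the data in the request body rather than in the URL, but on its own it is no harder to forge, which is why the token check matters.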

    Read the article

  • Aligning text to the bottom of a div: am I confused about CSS or about blueprint? [closed]

    - by larsks
    I've used Blueprint to prototype a very simple page layout, but after reading up on absolute vs. relative positioning and a number of online tutorials on vertical positioning, I'm not able to get things working the way I think they should. Here's my HTML:

        <div class="container" id="header">
          <div class="span-4" id="logo">
            <img src="logo.png" width="150" height="194" />
          </div>
          <div class="span-20 last" id="title">
            <h1 class="big">TITLE</h1>
          </div>
        </div>

    The document does include the Blueprint screen.css file. I want TITLE aligned with the bottom of the logo, which in practical terms means the bottom of #header. This was my first try:

        #header { position: relative; }
        #title  { font-size: 36pt; position: absolute; bottom: 0; }

    Not unexpectedly, in retrospect, this puts TITLE flush against the left edge of #header, but it failed to affect the vertical positioning of the title, so I got exactly the opposite of what I was looking for. So I tried this:

        #title    { position: relative; }
        #title h1 { font-size: 36pt; position: absolute; bottom: 0; }

    My theory was that this would align the h1 element with the bottom of the containing div, but instead it made TITLE disappear completely. I guess this means it's rendering off the visible screen somewhere. At this point I'm baffled; I'm hoping someone here can point me in the right direction. Thanks!
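
    For what it's worth, a likely explanation (an assumption, since it depends on the rest of the page): Blueprint's .span-* columns are floated, so #header collapses to zero height and bottom: 0 has nothing to measure against. A hedged sketch of the usual fix is to make #header contain its floats and then pin the heading to its bottom edge; the 170px offset below is purely illustrative.

        #header {
            position: relative;
            overflow: hidden;   /* or a clearfix, so the floated logo gives #header real height */
        }
        #title h1 {
            font-size: 36pt;
            position: absolute;
            bottom: 0;          /* now measured against the full-height #header */
            left: 170px;        /* illustrative: clears the 150px-wide logo */
            margin: 0;
        }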

    Read the article

  • Preventing indexing duplicate content by search engines

    - by umesh awasthi
    I am in the process of migrating my old domain (www.oldurl.com) to a new domain (www.newurl.com). Almost all of the content, the URL structure and the database stay the same except for a few URLs; the only real difference is the domain name. I have added entries to Apache's .htaccess file to set up 301 redirects, and for now I have blocked all search engines from crawling the new domain via its robots.txt file. I am not sure how to handle the duplicate content issue once the new domain goes live. Should I block search engines from crawling/indexing my old domain instead? I am new to this field and not sure whether this is actually a duplicate content issue at all.
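
    For reference, a minimal sketch of the kind of .htaccess rule usually used for this (domain names taken from the question; exact flags depend on the setup): the old domain 301-redirects every path to the same path on the new domain, which also tells search engines which copy is canonical.

        # Sketch: placed on the old domain (www.oldurl.com) -- not verified against the poster's config
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?oldurl\.com$ [NC]
        RewriteRule ^(.*)$ http://www.newurl.com/$1 [R=301,L]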

    Read the article

  • How to interpret number of URL errors in Google webmaster tools

    - by user359650
    Google recently made some changes to Webmaster Tools, which are explained here: http://googlewebmastercentral.blogspot.com/2012/03/crawl-errors-next-generation.html. One thing I could not find out is how to interpret the number of errors over time. At the end of February we migrated our website and didn't implement redirect rules for some pages (quite a few, actually), and the Crawl errors report now shows errors for them. What I don't know is whether the number of errors is cumulative over time (i.e. if Google's bots crawl your website on two different days and find one separate issue on each day, will they report one error for each day, or one for the first day and two for the second?). Based on the Crawl stats, the number of requests made by Google's bots doesn't increase. Therefore I believe the number of errors reported is cumulative, and that an error detected on one day keeps being taken into account and reported on subsequent days until the underlying problem is fixed and the page is crawled again (or until you manually mark the error as fixed), because if Google doesn't make more requests to a website, there is no way it can check new pages and old pages at the same time. Q: Am I interpreting the number of errors correctly?

    Read the article

  • I need a little help with .htaccess rewrite

    - by Pinokyo
    I need a little help with my .htaccess file. I have song, singer and album links that I want to rewrite. I already rewrote the links, and they currently look like this: for songs, /song/song_name; for singers, /singer_name; for albums, /album_name. From my .htaccess file:

        RewriteEngine on
        RewriteRule ^singer/([^/\.]+)/?$ /core/controller.php?singer=$1 [L]
        RewriteRule ^song/([^/\.]+)/?$   /core/controller.php?song=$1   [L]
        RewriteRule ^album/([^/\.]+)/?$  /core/controller.php?album=$1  [L]

    I need the links for songs, singers and albums to look like this instead: for songs, /singer_name/song_name; for singers, /singer_name; for albums, /singer_name/album_name. Can anyone help me with this, please?
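
    A hedged sketch of one way the new structure could map onto the same controller.php (untested; the item parameter name is made up, and since /singer_name/song_name and /singer_name/album_name share one pattern, controller.php would have to work out whether the second segment is a song or an album):

        RewriteEngine on

        # Skip real files and directories so /core, images, CSS, etc. keep working
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        # /singer_name -> singer page
        RewriteRule ^([^/\.]+)/?$ /core/controller.php?singer=$1 [L]

        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        # /singer_name/item_name -> controller.php decides whether the item is a song or an album
        RewriteRule ^([^/\.]+)/([^/\.]+)/?$ /core/controller.php?singer=$1&item=$2 [L]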

    Read the article

  • Meta tags again. Good or bad to use them as page content?

    - by Guandalino
    From an SEO point of view, is it wise to use exactly the same page title value and keyword/description meta tag values not only as meta information, but also as page content? An example illustrates what I mean. Thanks for any answer, best regards.

        <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN" "http://www.w3.org/TR/html4/strict.dtd">
        <html>
        <head>
          <title>Meta tags again. Good or bad to use them as page content?</title>
          <meta name="DESCRIPTION" content="Why it is wise to use (or not) page title, meta tags description and keyword values as page content.">
          <meta name="KEYWORDS" content="seo,meta,tags,cms,content">
        </head>
        <body>
          <h1>Meta tags again. Good or bad to use them as page content?</h1>
          <h2>Why it is wise to use (or not) page title, meta tags description and keyword values as page content.</h2>
          <ul>
            <li><a href="http://webmasters.stackexchange.com/questions/tagged/seo">seo</a>
            <li><a href="http://webmasters.stackexchange.com/questions/tagged/meta">meta</a>
            <li><a href="http://webmasters.stackexchange.com/questions/tagged/tags">tags</a>
            <li><a href="http://webmasters.stackexchange.com/questions/tagged/cms">cms</a>
            <li><a href="http://webmasters.stackexchange.com/questions/tagged/content">content</a>
          </ul>
          <p>Read the discussion on <a href="#">webmasters.stackexchange.com</a>.
        </body>
        </html>

    Read the article

  • What liability concerns do advertising vendors raise, and how can I address them?

    - by Beofett
    One of the websites I administer wants to provide free advertising in the form of direct links to vendors at an event they are running. Up until now, there has been no advertising whatsoever on the site (or any of our other sites). The site is for a for-profit business. The idea of implicit endorsement of any vendors we advertise has been raised, which brought up the question of what we need to do, if anything, to protect ourselves from any potential problems such endorsement might create. I know that many sites have clauses in their Terms of Service that state that (in a nutshell) they are not responsible for any problems or grievances between the visitors to the site and any vendor advertised or linked. Are there other steps that a website typically takes when considering advertising, such as getting the advertiser to provide some sort of certification that their ad will not violate any trademarks or copyrighted material?

    Read the article

  • Screencast several application windows at once in Microsoft Windows

    - by Birt
    I have several (20+) applications running on a Microsoft Windows PC. What I would like is a solution that lets me broadcast the window of each application in a web page, in read-only mode (there is no need for users to interact with it). This should work even if the application is in the background, since there is no way to fit all of them on the screen. I have searched very extensively, from simple screencasting apps such as Camtasia, CamStudio and VHScrCap, to things like VNC (I haven't found any server able to broadcast multiple windows at once, much less background windows), and even application virtualization, but in the end I haven't found anything that fits my needs. Most solutions that can capture a window rather than the whole desktop will only capture a single window, and on top of that they don't work when the window is in the background.

    Read the article

  • How to prevent access to website without SSL connection?

    - by CraigJ
    I have a website with an SSL certificate installed, so that if I access it using https instead of http I connect over a secure connection. However, I have noticed that I can still access the website non-securely, i.e. by using http instead of https. How can I prevent people from using the website in a non-secure manner? And if I have a directory on the website, e.g. samples/, can I prevent non-secure connections to just that directory?
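
    A common way to do this on Apache (a hedged sketch, assuming mod_rewrite is available and .htaccess overrides are allowed) is to 301-redirect any plain-http request to its https equivalent; placing the same rules in an .htaccess file inside samples/ would restrict the forced redirect to just that directory.

        RewriteEngine On
        # If the request did not arrive over SSL, send the browser to the https:// version
        RewriteCond %{HTTPS} !=on
        RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]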

    Read the article

  • Is a backlink with a duplicate description and title from a news site bad for SEO?

    - by Dejan Pelzel
    I have a blog with over a thousand posts. I have posted some of them to a news aggregator site, including the same preview photo and description that I used on my own site along with the link back to the original post. Since the site is mainly videos and images, the description was usually a complete match of 4-6 lines of text. It now looks as though I have been affected by Panda, and since I am not doing anything shady, I suspect it might be due to duplicate content. For example, when I search for the title of one of my posts, sometimes my site is not returned at all, but the news aggregator site is. Could this be the problem Panda is reacting to?

    Read the article

  • What is the best way to construct a "remove multiple items" area (ASP.NET VB) [on hold]

    - by Darkcat Studios
    Let's say, for example, I have a (variable-length) two-dimensional array of product names and their unique product codes. I can display this list in a DataGrid, a table, etc. (imagine a standard shopping-basket scenario). What I need is to be able to tick multiple items and then, on clicking a submit button, fire an action. The bits I'm struggling with are: A) how to programmatically display an asp:CheckBox for each item (and give each a unique ID), and B) how to know which ones are ticked when the final action fires. (I'm not sure whether this question is better suited to the main Stack Overflow site, but there is so much activity there that questions just get lost now!)
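
    One way this is commonly handled in WebForms (a minimal sketch, with a made-up product dictionary standing in for the poster's array; control names are illustrative): bind the data to an asp:CheckBoxList, which renders a uniquely identified checkbox per row, then loop over its Items on postback to see which are selected.

        <%-- sketch markup: the checkbox list plus the button that fires the action --%>
        <asp:CheckBoxList ID="chkProducts" runat="server" />
        <asp:Button ID="btnRemove" runat="server" Text="Remove selected" OnClick="btnRemove_Click" />

    and in the VB code-behind:

        Protected Sub Page_Load(sender As Object, e As EventArgs) Handles Me.Load
            If Not IsPostBack Then
                ' Illustrative stand-in for the real product data: code -> name
                Dim products As New Dictionary(Of String, String) From {
                    {"P001", "Widget"}, {"P002", "Gadget"}}
                chkProducts.DataSource = products
                chkProducts.DataValueField = "Key"     ' unique product code
                chkProducts.DataTextField = "Value"    ' product name shown next to the checkbox
                chkProducts.DataBind()
            End If
        End Sub

        Protected Sub btnRemove_Click(sender As Object, e As EventArgs)
            For Each item As ListItem In chkProducts.Items
                If item.Selected Then
                    ' item.Value holds the unique product code of a ticked row -- act on it here
                End If
            Next
        End Sub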

    Read the article

  • Dropped impression 25 days after restructure

    - by Hamid
    Our website is a non-English property website (moshaver.com), similar to rightmove.co.uk. In September 2012 it was adversely affected by Panda, and our incoming clicks from Google dropped from around 3,000 to less than a thousand. We were hoping Google would eventually realize that we are not a spam website and things would get better, but by August 2013 we were almost sure we needed to do something, so we started restructuring our content. We used the canonical tag on our search result pages to point to our listing pages, used the noindex tag on listing pages that currently have no properties, and changed our title tags to friendlier ones, among other changes. The changes went live on 10th August. According to the graphs in Google Analytics' Search Engine Optimization section, our impressions almost doubled starting 15th August, but from that date our CTR dropped from around 15% to 8%. That might be because of the changed title tags (people were less likely to click on them), or it might simply be normal for increased impressions. This continued until 10th September, when our impressions dropped dramatically to less than a thousand: almost 30% of our original impressions (before the restructure) and 15% of the new impressions. At the same time our CTR increased dramatically to around 50%. I have two theories for this increase. The first is that these statistics are less accurate at lower impression counts. The second is that Google is now only showing our results for queries directly related to our website (our name, our URL) and not for general terms such as "apartments in a specific city"; this second theory would also explain the dramatic decrease in impressions. Digging into the analytics data a little more, I put together a breakdown of our impressions, clicks and CTR across Google web search and image search. It shows that most of our increased impressions after the restructure were in image search, where I don't think users would be looking for our kind of content, and that the drop in our web search CTR is not as dramatic as the drop in overall CTR (-30% compared to -60%). Is it possible that Google tested our new structure for 25 days and then decided to decrease our impressions because of the new, lower CTR, or should we look for another factor? If this is the case, how long does it usually take for Google to give us another chance? It has been one month since our impressions dropped.

    Read the article

  • SEO for a list of products with filters

    - by dana
    I am wondering whether there is a recommended "best practice" for product search SEO. I know to create a dynamic sitemap file that lists links to all products on the site. However, I also want to implement a bookmarkable "advanced search". Should I let search engines index any of the results? Take the following parameters for a search on a make-believe used-car website:

        minprice    (minimum price in dollars)
        maxprice    (maximum price in dollars)
        make        (honda, audi, volvo)
        model       (accord, A4, S40)
        minyear     (minimum model year)
        maxyear     (maximum model year)
        minmileage  (minimum mileage)
        maxmileage  (maximum mileage)

    Given these parameters, there could be an infinite number of search combinations, for example:

        Price between $10,000 and $20,000:         /search?minprice=10000&maxprice=20000
        Audis with less than 50k miles:            /search?make=audi&maxmileage=50000
        More than 100,000 miles and under $5,000:  /search?minmileage=100000&maxprice=5000

    Over time there may be inbound links to a variety of these searches, yet they are all slices of the same data. Should I allow all of these searches to be indexed?
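
    For what it's worth, one common pattern (a sketch of the idea, not a definitive recommendation): keep the filtered result pages for visitors and inbound links, but mark them noindex, follow, or point rel="canonical" at an unfiltered listing, so crawlers can follow through to the product pages without indexing every parameter combination. The URLs below are placeholders.

        <!-- emitted in the <head> of /search?... result pages (illustrative) -->
        <meta name="robots" content="noindex, follow">

        <!-- or, alternatively, consolidate signals onto one canonical listing URL -->
        <link rel="canonical" href="http://www.example.com/used-cars/">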

    Read the article

  • How to create the right loop / jquery / picture wipe? [migrated]

    - by Razor
    <script type="text/javascript" src="slider/jquery.touch-gallery-1.0.0.js"></script> <script type="text/javascript" src="slider/jquery.touch-gallery-1.0.0.min.js"></script> <script> $('body').live("click", function() { for (var i=0; i<4; i++) { alert("bla: "+i); } }); $('body').live("swipeleft", function(){ var nextpage = $("#page00002").next('section[data-role="page"]'); $.mobile.changePage(nextpage, 'slide'); }); $('body').live("swiperight", function(){ var prevpage = $("#page1").prev('section[data-role="page"]'); $.mobile.changePage(prevpage, 'slide'); }); </script>

    Read the article

  • CSS Intelligent Merger

    - by BHare
    I am looking for a tool very similar to http://www.tothepc.com/archives/combine-merge-multiple-css-files/ -- however, given this example:

    test1.css:

        #admin {
            background: #c9d2dc;
            border-color: #ccc
        }

    test2.css:

        #admin {
            background: #222;
            border-bottom: 1px solid #444;
            border-left: 1px solid #444;
            padding: 2px;
            position: fixed;
            right: 0px;
            top: 0px;
            width: 120px;
            z-index: 2
        }

    it will only let you select one rule or the other. I want them merged, producing:

        #admin {
            background: #c9d2dc;
            border-color: #ccc;
            border-bottom: 1px solid #444;
            border-left: 1px solid #444;
            padding: 2px;
            position: fixed;
            right: 0px;
            top: 0px;
            width: 120px;
            z-index: 2
        }

    Read the article

  • Best SEO practices for mobile URLs: 301, rel=canonical, or something else?

    - by Chris
    I am developing a site with a mobile version and am trying to figure out the appropriate way to manage the URLs for search engines. So far I've considered:

    1. Having a separate mobile site (m.example.com) with rel="canonical" links to the regular site.
    2. Putting both the mobile site and the full site on one URL (example.com) and doing user-agent sniffing.
    3. Another opinion, from Spencer: "If you have a mobile site at a separate location or URL, you should 301 redirect each and every mobile page to its corresponding page on your main website. Employ user agent detection so that the mobile optimized version is served up if someone's coming in from a hand-held." (http://developer.practicalecommerce.com/articles/1722-Mobile-site-Development-Best-Practices-for-SEO-Usability)

    Both 2 and 3 make it hard for a user who wants to switch to the full site or the mobile site manually, but I'm not sure 1 is the best alternative. What's the best way to write URLs for a mobile site?
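
    For reference, the arrangement Google documents for separate mobile URLs is a two-way annotation rather than a one-way canonical: the desktop page declares its mobile alternate, and the mobile page points its canonical back at the desktop page, which keeps both reachable while consolidating ranking signals. A sketch with placeholder URLs:

        <!-- on the desktop page, e.g. http://www.example.com/page-1 -->
        <link rel="alternate" media="only screen and (max-width: 640px)"
              href="http://m.example.com/page-1">

        <!-- on the corresponding mobile page, e.g. http://m.example.com/page-1 -->
        <link rel="canonical" href="http://www.example.com/page-1">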

    Read the article

  • How long should my Html Page Title Really be?

    - by RandomBen
    How long should the text within my <title></title> tags really be? I know Google cuts it off at some point, but when? When I used IIS7's SEO Toolkit 1.0, I got an error stating my title should be under 65 characters. I have a book by Bruce Clay that says to use 62-70 characters and roughly 9 +/- 3 words. I have also used SenSEO's Firefox add-on, and it says to use a maximum of 65 characters or roughly 15 words. What is the maximum, really? I have two sources saying 65 and one saying 72, but Bruce Clay is generally held in high regard.

    Read the article

  • How to create a good sitemap for dynamic website

    - by Saif Bechan
    I have a website with dynamic content and different kinds of pages. Some pages rarely change, while others, like the blog pages, change often. The blog pages also have links for sorting, for example sorting by date, ascending or descending. Some pages also have links to different tabbed content, plus plain anchor links. When I use an XML sitemap generator, all of these links get thrown into the sitemap, and I don't think they are all really relevant. The blog posts have so far been included in the sitemap as well. Is that really necessary? I think the links to the blog posts can be indexed just fine on their own. Is the best approach simply to add the main menu links to the sitemap manually, or is listing everything really recommended?
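
    For reference, a sitemap does not have to list every URL a generator can find; a hand-curated file containing only canonical pages (no sort-order variants, tab views, or anchor links) is perfectly valid, since a sitemap is a hint rather than a complete inventory. A minimal sketch with placeholder URLs:

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <url>
            <loc>http://www.example.com/</loc>
            <changefreq>monthly</changefreq>
          </url>
          <url>
            <loc>http://www.example.com/blog/</loc>
            <changefreq>daily</changefreq>
          </url>
          <url>
            <loc>http://www.example.com/blog/some-post</loc>
          </url>
        </urlset>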

    Read the article

  • Author Bio on all pages - Is it duplicate content?

    - by Rana Prathap
    On a website with user-generated content, I provide an author bio under every article on the site. The author bio is the same under every article the same author wrote. For some authors the bio is no longer than a couple of sentences, but for some of the more descriptive writers it runs a good 100 words. Those 100 words get repeated across almost 15 pages, some of which have little substantial original content of their own (such as haikus). Will this lead to duplicate content problems?

    Read the article

  • Google keeps indexing /comment/reply URL

    - by jaypabs
    With the new Google algorithm update called Penguin, I think my site is being penalized for webspam. But of course I don't create posts that would look like spam to Google; I think it is just how Google indexes my site. I found out that Google indexes URLs on my site such as http://www.example.com/comment/reply/3866/26556, so there are many comment/reply URLs in its index. I have already added:

        Disallow: /comment/reply/
        Disallow: /?q=comment/reply/

    but Google still indexes these URLs. Any idea how to prevent Google from indexing comments?
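
    Worth noting as background: robots.txt only blocks crawling, so URLs Google already knows about can stay indexed, and while crawling is blocked Googlebot can never see a noindex directive on those pages. The usual approach is to let the comment/reply URLs be crawled but serve them with a noindex directive, for example:

        <meta name="robots" content="noindex, follow">

    The same directive can also be sent as an X-Robots-Tag: noindex response header, and the Disallow lines would then be removed so Googlebot is actually able to fetch the pages and see it.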

    Read the article

  • Robots.txt and "Bad" Robots

    - by Lynda
    I understand robots.txt and its purpose. I have read some people saying that a robots.txt file gives "bad" robots (those that do not obey it) a map to pages on your site that you do not want accessed. While I am not looking to start a debate about that, I do have a question. Suppose I have a structure like this (note: there are no pages directly within /Folder/, only other folders):

        /Folder/
            /Sub-Folder 1/
            /Sub-Folder 2/

    If I add Disallow: /Folder/, it will prevent "good" robots from accessing that directory and any contents within the sub-folders. We know that bad robots will see /Folder/ in the robots.txt, but will they be able to see and access the sub-folders and the pages within them if those are not listed in the robots.txt? (Note: I do not fully understand how robots, good or bad, crawl a site beyond using robots.txt and the links within the site.)

    Read the article

  • Filtering your offices IPs from Google Analytics when each has a dynamic IP?

    - by leeand00
    I found the documentation for filtering IPs out of Google Analytics, but our company's several locations all have dynamic IP addresses that change every 30 days, from what I'm told. I know from working with Dynamic DNS that the provider usually gives you a script that you configure your router to run when its IP address changes or when it restarts, which passes the new IP address to the DDNS server. I'm wondering whether there might be a way to write, or reuse, a script that does the same thing with the Google Analytics API.

    Read the article

  • Static site generator with web-based file manager?

    - by user234
    I've been checking out static web site generators, which has led me to lots of articles about them! However, nothing is said about how to edit files through a browser; it's always assumed you have DropBox, some kind of FTP access, or terminal access. The only generator I could find that includes a browser-based admin screen is Kirby (getkirby.com, mentioned at modernstatic.com). Besides that application, what setup would you recommend to get both static HTML generation and web-based file management? Thanks!

    Read the article
