Search Results

Search found 9721 results on 389 pages for 'quicktest pro'.

  • Admin form that generates an email confirmation ends up in SPAM [duplicate]

    - by PJD Creative
    This question already has an answer here: "How can I prevent my mail from being classified as spam?" (10 answers). I have an admin form that I have set up for a client, which generates an email confirmation from a template I have designed. It works really well, but it ends up in spam some of the time, and this is really frustrating as it is just confirming for the customer some details of what they have just booked; it is not at all spam, and it is accessed via a page where the admin requires a login. Any suggestions as to why this may end up in spam? It does have dollar signs ($$) as it is confirming a price, and I'm assuming this is one problem; the rest of it is just general dates and info about the confirmation. Are there any suggestions on how to keep this out of spam? Thanks in advance.
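
    A minimal sketch of what the sending side might look like in PHP, with hypothetical addresses and booking text; the usual first checks are that the From domain matches the server's SPF/DKIM setup and that the body is plain, readable text rather than symbol-heavy:

        <?php
        // Hedged sketch: addresses and booking text below are hypothetical.
        // A From address on the site's own domain (covered by its SPF/DKIM records)
        // and a plain-text body with a written-out price tend to score better with
        // spam filters than a mismatched sender and "$$" symbols.
        $to      = 'customer@example.com';
        $subject = 'Your booking confirmation';
        $body    = "Your booking for 12 May is confirmed.\nTotal: 120.00\n\nRegards,\nThe team";
        $headers = "From: bookings@example.com\r\n"
                 . "Reply-To: bookings@example.com\r\n"
                 . "MIME-Version: 1.0\r\n"
                 . "Content-Type: text/plain; charset=UTF-8\r\n";

        mail($to, $subject, $body, $headers);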

    Read the article

  • Google Analytics Funnel Step Regular Expression Not Working

    - by scoarescoare
    The first step in a funnel is going to have a dynamic ending fragment. Examples:

        http://mysite.com/invite/tickle-party
        http://mysite.com/invite/pajama-party
        http://mysite.com/invite/puppy-party

    To allow for such dynamism, I provided this URL for step one: \invite(.*) My goals work, but the funnel visualization report shows 0 for everything. I know this problem is due to the regex in the funnel step, because I copied this entire goal except that I replaced \invite(.*) with /invite/puppy-party. When I hardcoded /invite/puppy-party, the funnel worked as expected. Why is my funnel report not working with my original funnel step URL parameter?

    Read the article

  • International search: how to show different domains in Google+ Local?

    - by Baumr
    Background: A site has multiple ccTLDs: example.com for people in the US, example.co.uk for UK users, example.de for Germany, example.fr for France, etc. Searching for certain city keywords will return a list of Google+ Local (formerly Places) results, each linking to the corresponding company website.

    Problem: When searching on www.google.de, the domain of the site intended for US users (example.com) appears instead of the corresponding ccTLD (example.de) aimed at German users. This applies to all languages. In my opinion, and for the purposes of this business, it's not a good user experience: searchers would most likely prefer to book on a site localized for them (e.g. in their language and currency).

    Question: Is it possible to return different ccTLDs in these local search listings for users across the globe? Currently, Google+ Local seems to support adding only a single "Website" field.

    Solutions I have considered: Creating duplicate Google Places listings for each URL would be spammy (and not viable when there are hundreds of locations, each needing a listing in 8 languages). I don't see the hreflang annotation helping either, and GWMT geotargeting is already set.

    Read the article

  • How to prevent duplication of content on a page with too many filters?

    - by Vikas Gulati
    I have a webpage where a user can search for items based on around 6 filters. Currently I have the page implemented with one filter as the base filter (part of the URL that would get indexed) and the other filters in the form of hash URLs (which won't get indexed). This way the duplication is less. Something like this: example.com/filter1value-items#by-filter3-filter3value-filter2-filter2value Now as you may see, only one filter is within the reach of the search engine while the rest are hashed. This way I could have 6 pages. Now the problem is that I expect users to use two filters as well at times while searching. As per my analysis using the Google Keyword Analyzer, a fair number of users might use two filters in conjunction while searching. So how should I go about it? Having all the filters as part of the URL would simply explode the number of pages, and sticking to the current way wouldn't let me target those users. I was thinking of going with at most 2 base filters and the rest as part of the hash URL. But the only thing stopping me is that it would lead to duplication of content as per Google Webmaster Tools' suggestions on URL structure.
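
    A hedged sketch of that compromise in PHP (the filter names and URL scheme are placeholders): the first one or two filters form the crawlable path, and everything else stays behind the hash so it never creates a new indexable URL.

        <?php
        // Hedged sketch with hypothetical filter names and URL scheme.
        // The first $maxIndexable filters become part of the crawlable path;
        // the rest are appended after '#' so they never reach the index.
        function buildFilterUrl(array $filters, $maxIndexable = 2) {
            $indexable = array_slice($filters, 0, $maxIndexable, true);
            $hashed    = array_slice($filters, $maxIndexable, null, true);

            $url = 'http://example.com/' . implode('-', array_values($indexable)) . '-items';

            if ($hashed) {
                $parts = array();
                foreach ($hashed as $name => $value) {
                    $parts[] = 'by-' . $name . '-' . $value;
                }
                $url .= '#' . implode('-', $parts);
            }
            return $url;
        }

        // e.g. http://example.com/red-large-items#by-brand-acme
        echo buildFilterUrl(array('color' => 'red', 'size' => 'large', 'brand' => 'acme'));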

    Read the article

  • Getting user generated content with no titles to rank

    - by hugo
    We are creating a site that allows users to generate content. The user is provided with a text field only (no title), similar to Twitter, Facebook, and Google+. Each piece of content created by the users will have a dedicated page/URL. Since the page has no title, I was wondering how search engines will index and display our pages. If the content is shared on other social networks, what will those results look like if there is no title for the Open Graph or Twitter tags?
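
    One hedged idea, not something the search engines or social networks prescribe: derive a stand-in title from the first words of the user's text and emit it in the title and Open Graph/Twitter tags, so snippets and shares have something to show. A PHP sketch (the $postText value is hypothetical):

        <?php
        // Hedged sketch: $postText would come from wherever the user content is stored.
        $postText = 'Just tried the new espresso place downtown and the flat white is unreal';

        // Use roughly the first 60 characters of the content as a stand-in title.
        $clean = trim(strip_tags($postText));
        $title = mb_substr($clean, 0, 60);
        if (mb_strlen($clean) > 60) {
            $title .= '...';
        }
        ?>
        <title><?php echo htmlspecialchars($title); ?></title>
        <meta property="og:title" content="<?php echo htmlspecialchars($title); ?>">
        <meta name="twitter:title" content="<?php echo htmlspecialchars($title); ?>">
        <meta property="og:description" content="<?php echo htmlspecialchars(mb_substr($clean, 0, 200)); ?>">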

    Read the article

  • Will using two different tracking codes affect my SERP

    - by Danny Hefer
    Hello everyone, and thanks for your time! I am now facing a problem after a site migration. The new site is basically an improved version of the old site, with the same content and some extras. After pointing the domain name to the new site, the old site was still online for a while but didn't get any traffic. The new site has its own tracking code. So, the old tracking code has age (something like 7 years) but no visitors for a month, while the new tracking code is a month old with acceptable traffic. How do you think Google will react if I add the old tracking code to the new site? Thanks in advance!

    Read the article

  • How to create a good sitemap for dynamic website

    - by Saif Bechan
    I have a website with dynamic content and different kinds of pages. I have some pages that rarely change, and I have pages like blogs that change often. The blog pages also have links for sorting, for example sorting on date, ascending or descending. On some of the pages I also have links to different tabbed content, and links that are just anchor links. Now when I use an XML sitemap generator, all of these links are thrown into the sitemap, and I don't think all of them are really relevant. The blog posts up until now are also taken into the sitemap. Is this really necessary? I think the links to the blog posts can be indexed just fine. Is the best way to make a sitemap just to manually assign the main menu links to the sitemap, or is indexing everything really recommended?
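
    A rough sketch of generating the sitemap from the pages that actually matter, instead of letting a crawler-based generator pick up every sorting, tab and anchor variant (the URLs and the $posts array are placeholders):

        <?php
        // Hedged sketch: only canonical URLs go in; sort orders, tabs and
        // anchor links are left out because they are just views of the same page.
        $staticPages = array(
            'http://example.com/'        => 'monthly',
            'http://example.com/about'   => 'yearly',
            'http://example.com/contact' => 'yearly',
        );

        // Hypothetical list of blog post URLs and their last-modified dates.
        $posts = array(
            array('url' => 'http://example.com/blog/my-first-post', 'updated' => '2012-06-01'),
        );

        $xml  = '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
        $xml .= '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";

        foreach ($staticPages as $url => $freq) {
            $xml .= "  <url><loc>$url</loc><changefreq>$freq</changefreq></url>\n";
        }
        foreach ($posts as $post) {
            $xml .= "  <url><loc>{$post['url']}</loc><lastmod>{$post['updated']}</lastmod></url>\n";
        }

        $xml .= '</urlset>';
        file_put_contents('sitemap.xml', $xml);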

    Read the article

  • Is it good to buy multiple domains for competitive reasons?

    - by lowestofthekeys
    I am attempting to convince the higher-ups at my company that spending $55 to renew one domain for a year is bad when they end up having 3-4 domain names for one website. Their reasoning for doing so is to keep these domain names out of the hands of the competition. For example, the company name is Pie Consulting & Engineering. They want to buy up pieforensicconsulting.com to keep it out of the hands of a competitor (we also do forensic engineering). Could a competitor use that domain in any kind of diabolical way? I mean, I figure if someone is typing pieforensicconsulting into the URL field, they know what they're looking for, and if it redirects to another company, they're not just going to stay on that site.

    Read the article

  • Domain files download upon opening

    - by Marian
    I'm having this weird issue with my domain. My domain is saoo.eu, hosted on HostZilla. The issue is that whenever I open an HTML/PHP file it automatically downloads instead of opening in the browser. For example, the saoo.eu/test.html page. The same thing happens with the index.html file. What is going on? Also, if I want PHP code to run inside an HTML file I have to add an .htaccess file, but it doesn't seem to work. I've tested it before.

    Read the article

  • Breadcrumb for multiple categories

    - by Damodar Bashyal
    I post in multiple categories, so is it better to have:

        Consulting Services > Implementation > Service A
        Consulting Services > Optimization > Service A
        Consulting Services > Upgrade > Service A

    or

        Consulting Services > Implementation, Optimization, Upgrade > Service A

    I was doing it the second way; the problem is Google doesn't show the third set of crumbs, i.e. it only displays "Consulting Services" on the search result. But having multiple breadcrumbs on the page doesn't look good. Any suggestions?

    Update (for @PatomaS's question): I mean 3 lines of breadcrumbs, see above. I have posted the same article (Service A) in 3 categories (Implementation, Optimization, Upgrade), so you can reach the same article through 3 categories. So what's the best breadcrumb to display on the article 'Service A'?

    Read the article

  • last-modified/etags - to include or not?

    - by Kae Verens
    Google's PageSpeed plugin suggests that a website should include Last-Modified and ETag headers: "Specify a cache validator: Resources that do not specify a cache validator cannot be refreshed efficiently. Specify a Last-Modified or ETag header to enable cache validation." However, the AskApache article below suggests that by not including them at all, we speed up websites by eliminating If-Modified-Since and If-None-Match requests: http://www.askapache.com/htaccess/apache-speed-last-modified.html These recommendations are in direct opposition; which should be implemented? I'm leaning towards the AskApache suggestion, as when I want a file cached, I don't want it refreshed.
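
    For what it's worth, a minimal PHP sketch (file name assumed) of what the validator buys when it is sent: the browser still revalidates, but the answer can be an empty 304 rather than the full body. A long max-age, as the AskApache article favours, avoids even that round trip.

        <?php
        // Hedged sketch: a static resource served through PHP so we control the headers.
        $file  = 'css/style.css';
        $mtime = filemtime($file);

        header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $mtime) . ' GMT');
        header('Cache-Control: max-age=86400'); // a long max-age avoids most revalidations anyway

        // If the browser's copy is still current, reply with 304 and no body.
        if (isset($_SERVER['HTTP_IF_MODIFIED_SINCE']) &&
            strtotime($_SERVER['HTTP_IF_MODIFIED_SINCE']) >= $mtime) {
            header('HTTP/1.1 304 Not Modified');
            exit;
        }

        header('Content-Type: text/css');
        readfile($file);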

    Read the article

  • Web Development Environment: How to distribute edited hosts files over bunch of mac machines?

    - by Alex Reds
    I am doing some research to prepare a web development environment for our small (10 people and growing) new office.

    Use case: For each new web project we usually create a new alias on an Apache server, e.g. someproject.companywebsite. From my understanding, in order for the rest of our team (including managers and directors) to see this website locally, they will need to edit their hosts file (e.g. "192.168.1.10 someproject.companywebsite"), and so on each time for a new project (there can be 2-5 each week).

    Solution: I am looking for a way to edit this hosts file only once and distribute it to all Mac machines on our network at once, or at least much more smoothly than poking around on each machine every time, over and over again. Is that possible? Or is that a very wrong way of doing it? Perhaps we would be better off setting up our own local DNS server and pointing our router to it? Though running our own DNS server concerns me a bit because of possible network interruptions and other lags, if you know what I mean. Or perhaps there are other workflows for this? What's the best way to handle such things? I'll be grateful to hear some advice from experienced admins. I couldn't find this information on the internet, so if you know where to read about it, point me in the right direction. Thank you in advance, Alex

    Read the article

  • Do you know a good html mailing list management software with admin levels?

    - by SirG
    I'm basically looking for a program/app/script (it can be commercial) which I can ideally install on a Windows server (we can run ASP, ASP.NET, PHP and MSSQL). We have different groups of people who send newsletters to web members, and I want to bring it all into one app which I can monitor and control. Ideally it would be able to create HTML newsletters (with some templates), track emails and click-throughs, and manage email list subscribes/unsubscribes. Importantly, it should have different levels of admin, so a newsletter creator could log in, create and send off an email, and it would go into a queue where a communications editor has an overview of all newsletters and can approve the sending of the emails or edit them before they are sent off. Before I start coding something up myself I thought I'd ask if anyone has any advice! Cheers!

    Read the article

  • Google affecting my SERP Rank?

    - by Asad Moeen
    The following are some of my website's details.

        Home page: [thebluewaffles].[com]
        Keywords: Blue Waffles; the rest of the keywords are post/subject specific.
        Site description: health articles blog
        Site age: 1.5 years

    A short history: when I started my website, the few things in my mind when posting content were at least 500 words on each page and writing all the articles with to-the-point information. I didn't go really fast with it, which is why I only have about 15 articles in 1.5 years. The SEO strategy was fairly simple: I shared links through social marketing websites and some article sharing websites, after which I could see my website's rankings in the top 5 SERP results. I ranked well for about 8 months continuously but didn't keep updating content, due to which there were some 3 rough months when no content was posted because of personal work. The SERPs dropped to the 2nd page in April and almost started disappearing in May. I asked a lot of people about it and most came up with the reason "no updates to the site", so I started updating my site again since then; November has almost started and I still see no signs of my website's ranking. Another important point is that when I post a new article and do a title search in Google, I see it ranks well for the first 10 hours and then disappears. What could be wrong here?

    Read the article

  • How can I allow robots access to my sitemap, but prevent casual users from accessing it?

    - by morpheous
    I am storing my sitemaps in my web folder. I want web crawlers (Googlebot etc.) to be able to access the file, but I don't necessarily want all and sundry to have access to it. For example, this site (superuser.com) has a site index, as specified by its robots.txt file (http://superuser.com/robots.txt). However, when you type http://superuser.com/sitemap.xml, you are directed to a 404 page. How can I implement the same thing on my website? I am running a LAMP website, and I am also using a sitemap index file (so I have multiple sitemaps for the site). I would like to use the same mechanism to make them unavailable via a browser, as described above.
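
    One possible mechanism (a sketch, not necessarily what superuser.com does): keep the sitemap index outside the normal document path, reference a small PHP script from robots.txt, and have that script return the XML only when the user agent looks like a known crawler and a 404 to everyone else. User-agent strings can be spoofed, so this only keeps out casual browsers; a stricter version would verify crawler IPs via reverse DNS.

        <?php
        // sitemap.php - hedged sketch; the bot list and file path are assumptions.
        $ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

        // Crude allow-list of crawler user agents.
        if (!preg_match('/Googlebot|bingbot|Slurp/i', $ua)) {
            header('HTTP/1.1 404 Not Found');
            exit;
        }

        header('Content-Type: application/xml');
        readfile('/var/www/private/sitemap-index.xml'); // kept outside the web root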

    Read the article

  • Easiest solution to setup payments for a conference registration page?

    - by Keith G
    I've got a fair amount of website development experience, but I've been asked to set up a conference registration page in short order. However, I have absolutely zero experience with shopping carts, payment processing, etc. What is the absolute quickest and easiest way to get this thing up and running? Here are my criteria:

    - The site is currently hosted on GoDaddy.com, and someone has suggested using their QuickCart.
    - We cannot use any option that visits the paypal.com domain, because it has been blocked by a large segment of the potential audience (on a military base).
    - Need a $0 option for speakers.
    - Cancellations can be accepted, so something that could handle that would be a bonus.
    - There is no "product" other than a confirmation that they have registered for the conference.

    Read the article

  • Safety of purchasing country-specific domains from registrars?

    - by Marc Bollinger
    None of the previous questions tackle some of the more one-off countries' registries, beyond .co.uk, .it, et al., or else I'd have found an answer myself. Is it safe to buy a domain under a foreign country's TLD from a registrar? http://www.iana.org/domains/root/db/ I'm just looking for information for a vanity domain, so obviously I'm alright without an answer, but it's an unasked (or at least unanswered) question, and I'm not exactly in a hurry to give my credit card information across country lines, sight unseen.

    Read the article

  • Affiliate software to attract incoming customers

    - by Steve
    I am close to starting a new website for a small business which imports products from USA to Australia. The wholesaler says he will allow my client to be the sole distributor for Australia & New Zealand. I'm not sure what CMS or shopping cart software to use yet, but it will need to include an affiliate system to allow advertisers to push customers our way. Do you have any suggestions for robust, flexible affiliate software? Thanks.

    Read the article

  • Shared to Dedicated or Amazon CloudFront to improve performances and keep secured?

    - by user978548
    I have a WordPress site which currently takes about 1.8s to 2.5s for the home page to completely load in my country. The page weight is about 700 KB (static content included). In order to increase performance, I'm considering two solutions: switching to a dedicated host, or using Amazon S3/CloudFront to serve static content. My current shared hosting has servers in a neighboring country but not exactly in mine, and both Amazon and the dedicated host have servers there, so that's already an advantage. So considering all that, I still have three questions remaining:

    1. Currently having low traffic (100 unique visitors/day, but growing), will it make a huge difference between my shared hosting and a dedicated server?
    2. Knowing that I already use a cookie-less domain to deliver static content (but using a redirection to the same server), would using Amazon S3 make a real difference?
    3. Talking about the cons of dedicated vs Amazon S3: if I choose for the dedicated server something like Ubuntu Server, do daily package updates and have only port 80 open, would it be sufficient in terms of security (in comparison with my current shared hosting, which manages everything for me)?

    Read the article

  • "Search Friendly" domain names

    - by Ben
    We bought a few search-friendly domain names for the CPA site that I manage. Each of the domains we bought has the name of a nearby city and the word "cpa" in front of, or behind, the city name. The plan is to create a landing page for each of these domains with useful information about business filings, etc. specific to that city, as well as directions to our office from that city. The question is how to best utilize these new domains:

    - Should each domain be set up as a 301 redirect to mainsite.com/city?
    - Should each domain be its own single-page mini-site that links to mainsite.com?
    - What other options are there, and what are the pros/cons?

    Remember, the goal is to be more relevant in searches that use a nearby city name in their search for CPA/accounting services.
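
    If the 301 option is chosen, a hedged PHP sketch of mapping each city domain onto its landing page on the main site (the domain names and paths are placeholders):

        <?php
        // Hedged sketch: hypothetical city domains mapped to sections of the main site.
        $map = array(
            'springfieldcpa.com' => 'http://mainsite.com/springfield',
            'cpashelbyville.com' => 'http://mainsite.com/shelbyville',
        );

        $host = strtolower(preg_replace('/^www\./', '', $_SERVER['HTTP_HOST']));

        if (isset($map[$host])) {
            header('HTTP/1.1 301 Moved Permanently');
            header('Location: ' . $map[$host]);
            exit;
        }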

    Read the article

  • n00b needs some PHP syntax guidance [closed]

    - by Michael
    If you look at http://www.cruc.es/?paged=12/ and go to the bottom of the page you'll see the bottom navigation with the next and previous options. I've been able to make the page numbers work by changing page to paged= in the code. I don't know enough about PHP to get the previous/next options to work. Any advice would be appreciated and I've pasted the code below. Thank you: n00b

        if ( $query->found_posts > $query->query_vars["posts_per_page"] ) {
            echo '<ul class="paging">';

            // Previous link?
            if ( $page > 1 ) {
                echo '<li class="previous"><a href="'.$baseURL.'/page/'.($page-1).'/'.$qs.'">previous</a></li>';
            }

            // Loop through pages
            for ( $i=1; $i <= $query->max_num_pages; $i++ ) {
                // Current page or linked page?
                if ( $i == $page ) {
                    echo '<li class="active">'.$i.'</li>';
                } else {
                    echo '<li><a href="'.$baseURL.'/?paged='.$i.'/'.$qs.'">'.$i.'</a></li>';
                }
            }

            // Next link?
            if ( $page < $query->max_num_pages ) {
                echo '<li><a href="'.$baseURL.'/page/'.($page+1).'/'.$qs.'">next</a></li>';
            }

            echo '</ul>';
        }
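
    For reference, a hedged sketch of the previous/next branches rewritten to use the same ?paged= format that already works for the numbered links; whether ?paged=N or /page/N/ is correct ultimately depends on this theme's permalink settings:

        // Previous link?
        if ( $page > 1 ) {
            echo '<li class="previous"><a href="'.$baseURL.'/?paged='.($page-1).'/'.$qs.'">previous</a></li>';
        }

        // Next link?
        if ( $page < $query->max_num_pages ) {
            echo '<li><a href="'.$baseURL.'/?paged='.($page+1).'/'.$qs.'">next</a></li>';
        }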

    Read the article

  • Why get dedicated hosting? [closed]

    - by user176105
    Possible Duplicate: How to find web hosting that meets my requirements? I just finished writing a website and I'm about to publish it. I was looking at hosting options and I was about to get regular hosting from GoDaddy, which is about $6 a month with unlimited bandwidth, 150 GB of storage, 500 email accounts and 25 MySQL databases. The other option is dedicated servers, which range a lot in price but are around $200 a month. Why would someone choose a dedicated server? Is it because they max out the limits of regular hosting, or is it because the RAM/CPU is shared on regular hosting? If the latter, what will happen if a lot of users come to my site and max out the RAM/CPU?

    Read the article

  • adding tagged / dynamic pages in sitemap

    - by sam
    I've got a blog that's been running for about a year. I've made about 200 posts, and there should be about 220 pages to index (additional pages for about/contact etc.). When I go to crawl the site I get 1,900 pages because of all the pages that are related to tags I've used in my blog posts; about 70% of these pages contain only one blog post. When submitting my sitemap to Google, should I exclude all pages with /tagged/ in the URL so I'll only be submitting unique pages, or should I submit the full sitemap?
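
    If the tag pages do get excluded, a tiny hedged sketch of filtering them out before the sitemap is written ($urls stands in for however the generator collects links):

        <?php
        // Hedged sketch: drop tag archive URLs, keep the unique content pages.
        $urls = array(
            'http://example.com/about',
            'http://example.com/2012/06/some-post',
            'http://example.com/tagged/php',
        );

        $sitemapUrls = array_filter($urls, function ($url) {
            return strpos($url, '/tagged/') === false;
        });
        // $sitemapUrls now contains only the 220 or so unique pages.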

    Read the article

  • Multilingual website without language component in the URL

    - by user359650
    I'm working on a website for Canada which will have French and English versions. For SEO purposes, I would like to avoid using any language tag in URLs, because I believe it will have more impact (e.g. example.ca/products is better than en.example.ca/products or example.ca/en/products). I believe this is technically possible because the 2 languages are sufficiently different that the URLs won't conflict with one another (e.g. if you want a "product" page, it will be /products in English and /produits in French, so you know which language the URL is about). Since Google (and most likely others) doesn't rely on the URL (nor on HTML tags) to determine the content language, I don't see any problems with search engines. To make this possible I've thought about using a cookie distinct from the session cookie (e.g. example.org_language), with a long-term expiry (e.g. N years), that will remember the language chosen by the user. That way, when people visit the website with a new browser session, they get served the proper language. I have already given up on users being able to switch a given page from English to French: when people choose English or French from the menu, they will be redirected to the corresponding version of the home page. Do you foresee any problems with not using a language component in the URL (whether domain or path), as long as one makes sure the URLs don't conflict?
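
    A small PHP sketch of the cookie mechanism being described (the cookie name, domain and five-year expiry below are placeholders for the question's example.org_language-style cookie):

        <?php
        // Hedged sketch: remember an explicit language choice in a long-lived cookie.
        $supported = array('en', 'fr');

        if (isset($_GET['lang']) && in_array($_GET['lang'], $supported, true)) {
            setcookie('site_language', $_GET['lang'], time() + 5 * 365 * 24 * 3600, '/', '.example.ca');
            header('Location: /'); // back to the home page, now in the chosen language
            exit;
        }

        // On later visits, fall back to English until a choice has been made.
        $lang = (isset($_COOKIE['site_language']) && in_array($_COOKIE['site_language'], $supported, true))
            ? $_COOKIE['site_language']
            : 'en';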

    Read the article

  • When Googlebot sees a link, will it click it or navigate to it?

    - by FakeRainBrigand
    My site uses pushState and JSON data to display content. So, for example, this might appear on my page:

        <a href="/some/page">some page</a>

    The JavaScript then prevents the default action (following the link), and instead renders a view (using a different API, such as /getjson?some_page):

        $('[href]').click(function(){
            history.pushState(...);
            handleURL(...);
        });

    Assume my server will respond to requests at /some/page with a pre-rendered version. My questions are:

    - Will Googlebot receive the pre-rendered version, or allow JavaScript to instead invoke pushState, etc.?
    - If it doesn't make the direct request, will it wait for AJAX content to be loaded?
    - Does Googlebot implement pushState, so it will show the proper URL in search results?

    Read the article
