Search Results

Search found 13195 results on 528 pages for 'technical trainer pro'.

Page 225/528 | < Previous Page | 221 222 223 224 225 226 227 228 229 230 231 232  | Next Page >

  • Should I Use WordPress Category Archives or Regular Pages When Considering SEO?

    - by user1151640
    I've built a WordPress site based on posts and category archives (no pages). The menu links to category archive pages, each of which has a description, an image, and the relevant posts. Now that almost everything is finished, I've started to worry whether that was a good decision from an SEO standpoint. Will Google consider category archives a poor choice for sitelinks compared to regular pages?

    Read the article

  • How to tell Google that I have changed my website URLs?

    - by Momen M El Zalabany
    I have made major updates to my website and renamed all of my URLs. The problem is: how can I tell Google that the URLs have changed so that it refreshes its index? I have uploaded my sitemap via Google Webmaster Tools many times. My website URL: http://www.pndmasr.com My sitemap: http://www.pndmasr.com/sitemap.xml Still, every time I search Google for "pndmasr" I get results for the old pages. I have waited more than 3 days, yet the problem remains. Any solutions? Is there a problem with my sitemap?
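
    A common way to signal renamed URLs to Google, beyond resubmitting the sitemap, is to 301-redirect each old URL to its new location so crawlers replace the old entries in their index. The sketch below is only an illustration with a made-up URL mapping (the real old and new paths on pndmasr.com are not known here); it assumes requests are routed through a PHP front script. The same mapping is often done instead with Redirect 301 or mod_rewrite rules in .htaccess.

        <?php
        // Hypothetical mapping of old paths to new ones; replace with the real renamed URLs.
        $redirects = array(
            '/old-news-item.php' => '/news/new-slug/',
            '/products_old.php'  => '/products/',
        );

        $path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);

        if (isset($redirects[$path])) {
            // A permanent (301) redirect tells crawlers to drop the old URL
            // and index the new one in its place.
            header('HTTP/1.1 301 Moved Permanently');
            header('Location: http://www.pndmasr.com' . $redirects[$path]);
            exit;
        }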

    Read the article

  • How to change my website's appearance in a Facebook wall post?

    - by Lode
    When you post a website link in a Facebook wall post, Facebook fetches some content (a title, text, and an image) from the website to show to readers. Is there a way I can adjust or suggest which content Facebook uses or prefers? I found someone recommending <meta property="og:image" content="image.jpg">, but this doesn't seem to have any effect. Maybe Facebook caches these results for a while?
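
    For reference, Facebook's link preview is normally driven by Open Graph meta tags in the page <head>. The snippet below is a minimal sketch of how a PHP page might emit the commonly read properties; all values are placeholders rather than anything from the question, and og:image generally needs to be an absolute URL.

        <?php
        // Placeholder values; replace with the page's real title, summary, image and URL.
        $og = array(
            'og:title'       => 'Page title shown in the wall post',
            'og:description' => 'Short summary shown under the title',
            'og:image'       => 'http://www.example.com/images/preview.jpg', // absolute URL
            'og:url'         => 'http://www.example.com/this-page/',
            'og:type'        => 'website',
        );

        foreach ($og as $property => $content) {
            echo '<meta property="' . htmlspecialchars($property) . '" content="'
               . htmlspecialchars($content) . '">' . "\n";
        }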

    Read the article

  • auth_mysql and php [migrated]

    - by user1052448
    I have a directory that is password protected with auth_mysql in a virtual host file, using a MySQL user/pass combination. The problem I have is that one file inside that directory needs to be accessed without a user/pass. Is there a way I can pass the user/pass from within the PHP file? Or exclude that one file? What would I put inside the block below?

        <Location /password-protected>
            ...mysql password protection
            require valid-user
        </Location>
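
    One hedged reading of "pass the user/pass within the PHP file" is to have an unprotected script fetch the protected resource server-side and supply the credentials itself. The sketch below uses placeholder URL and credentials and is not a statement about how auth_mysql should be configured (excluding a single file is normally handled in the Apache configuration itself):

        <?php
        // Placeholder URL and credentials, for illustration only.
        $url  = 'http://www.example.com/password-protected/report.php';
        $user = 'someuser';
        $pass = 'somepass';

        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);     // return the body instead of printing it
        curl_setopt($ch, CURLOPT_HTTPAUTH, CURLAUTH_BASIC); // the directory uses HTTP Basic auth
        curl_setopt($ch, CURLOPT_USERPWD, $user . ':' . $pass);

        $body = curl_exec($ch);
        curl_close($ch);

        echo $body; // serve the protected file's content to the visitor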

    Read the article

  • SMS ad service for a PHP app

    - by BandonRandon
    I am about to launch a website that alerts the end user as their bus approaches the stop. The only problem is that this is a little expensive, as I'm paying about 5 cents per alert. I was thinking that if I added a short message, something like "brought to you by Acme Co", at the end of each alert, I might be able to recoup the cost without making my service a paid one. Does anyone know of any SMS ad companies, or how one would go about finding one? Google has failed me; I'm probably searching for the wrong thing.
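
    As a side note on the "short message at the end" idea: a single SMS is limited to 160 characters, so the sponsor line has to fit inside that budget. A minimal sketch (the sponsor text and helper function are made up for illustration) trims the alert before appending the tag line:

        <?php
        // Hypothetical helper: append a sponsor line while keeping the SMS within 160 characters.
        function add_sponsor_line($alert, $sponsor = ' - brought to you by Acme Co')
        {
            $limit = 160;                       // single-SMS length limit
            $room  = $limit - strlen($sponsor); // characters left for the alert itself

            if (strlen($alert) > $room) {
                $alert = substr($alert, 0, $room - 3) . '...'; // trim the alert to make room
            }
            return $alert . $sponsor;
        }

        echo add_sponsor_line('Route 52 is about 5 minutes from your stop (NW 10th & Couch).');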

    Read the article

  • Website hacked, cPanel password not encrypted?

    - by Jeg Bagus
    Yesterday I found out that all the websites hosted on my hosting account were hacked. I tried to change my password and, unbelievably, I COULD SEE my password there. That means my password can be recovered in plain text. I asked customer support and they said it's normal, cPanel stores the password like that. Is that true? Can a cPanel password really be recovered like that? They blame me because my WordPress version is out of date, but two of my websites are on different platforms, one built with CI and one with WordPress, and all of them were hacked. Is this host reliable?
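
    For context on why a recoverable password is a red flag: applications normally store only a one-way hash, so even the operator cannot read the original password back. The sketch below shows that general practice in PHP (it is not a description of how cPanel actually stores credentials):

        <?php
        // Store only a one-way hash of the password, never the password itself.
        // password_hash()/password_verify() require PHP 5.5 or newer.
        $hash = password_hash('my-secret-password', PASSWORD_DEFAULT);

        // Later, verify a login attempt against the stored hash.
        if (password_verify('my-secret-password', $hash)) {
            echo "Password correct\n";
        } else {
            echo "Password wrong\n";
        }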

    Read the article

  • URL is generating a /#!/splash-page

    - by user32642
    For some reason my site is generating a shebang - /#!/splash-page - in the URL. For example, when I type www.modernvintage1005.com, the browser returns www.modernvintage1005.com/#!/splash-page, and every subsequent page is /#!/about, /#!/contact, and so forth. There's absolutely nothing on Google about this. There is a lot of rewrite help for eliminating index.php from the home page URL, but that's it. How do I rewrite it to just say domain.com, domain.com/about.html, and so on? Here is my .htaccess file if you need to see it:

        # Rewrite Rule
        <IfModule mod_rewrite.c>
            RewriteEngine On
            RewriteBase /
            RewriteRule ^index\.php$ - [L]
            RewriteCond %{REQUEST_FILENAME} !-f
            RewriteCond %{REQUEST_FILENAME} !-d
            RewriteRule . /index.php [L]
        </IfModule>

        # compress text, html, javascript, css, xml:
        <IfModule mod_deflate.c>
            AddOutputFilterByType DEFLATE text/plain
            AddOutputFilterByType DEFLATE text/html
            AddOutputFilterByType DEFLATE text/xml
            AddOutputFilterByType DEFLATE text/css
            AddOutputFilterByType DEFLATE application/xml
            AddOutputFilterByType DEFLATE application/xhtml+xml
            AddOutputFilterByType DEFLATE application/rss+xml
            AddOutputFilterByType DEFLATE application/javascript
            AddOutputFilterByType DEFLATE application/x-javascript
            AddType x-font/otf .otf
            AddType x-font/ttf .ttf
            AddType x-font/eot .eot
            AddType x-font/woff .woff
            AddType image/x-icon .ico
            AddType image/png .png
        </IfModule>

        ## EXPIRES CACHING ##
        <IfModule mod_expires.c>
            ExpiresActive On
            ExpiresByType image/jpg "access 1 year"
            ExpiresByType image/jpeg "access 1 year"
            ExpiresByType image/gif "access 1 year"
            ExpiresByType image/png "access 1 year"
            ExpiresByType text/css "access 1 month"
            ExpiresByType application/pdf "access 1 month"
            ExpiresByType text/x-javascript "access 1 month"
            ExpiresByType application/x-shockwave-flash "access 1 month"
            ExpiresByType image/x-icon "access 1 year"
            ExpiresDefault "access 2 days"
        </IfModule>
        ## EXPIRES CACHING ##

    Read the article

  • Ad networks that will serve via HTTPS?

    - by Dogweather
    I've built a website with 160K page views per month that serves every page over HTTPS. The recent Firesheep news will probably increase the adoption of "HTTPS everywhere", but it has been very hard to find ad networks and affiliates that will serve their content via HTTPS. I don't want to use the ones that won't, because I don't want my visitors to get "broken security" warnings from their browsers (and of course, relevant ads would also leak private information). I'm tired of spending a ton of time signing up with ad networks and affiliates only to find out down the road that they don't support HTTPS (e.g. AdSense). Can anyone suggest any options, or point me to a list of such networks somewhere?

    Read the article

  • Understanding the technology that news websites use

    - by Registered User
    I am trying to understand the technology which many news websites use. Please have a look at this website: http://www.shritimes.com/ If you click any news item, that item gets zoomed and the viewer can read the news. As far as I can understand, they have done some programming by which a GIF image is opened up in a popup. Can someone help me understand what is used here - JavaScript, HTML, PHP, or what exactly? I have seen this feature on a lot of websites and I want to know how it is achieved. I am looking at it from the code side of things, so if someone can point me to the function call that does this, that would help. I am a programmer, but in C; I am new to web development.

    Read the article

  • Google is displaying "Translate this page" based on a previously registered domain's inbound links

    - by crnm
    I recently started a new project with a newly registered generic TLD domain. As soon as Google started indexing the page, it displayed a "Translate this page" link in the SERPs, which offers to translate the page into the language of a small Eastern European country from the language that the site actually uses. I tried everything to prevent this: language meta headers and attributes, localisation settings through Google Webmaster Tools... all to no avail - nothing helped. After a couple of weeks I spotted dozens of inbound links popping up in Google Webmaster Tools, all coming from that small Eastern European country, from sub-pages that are not active anymore (they either send 404's or 301's to the main page) and that had been written in that other language. So the domain had been registered before, and by the looks of it, it picked up a lot of possibly spammy links in that language. I can't even ask the linking sites to remove those links, because the pages they point to are not physically active anymore; they only exist in Google Webmaster Tools and/or Google's internal data. Now I'm at a loss about what to do. As my site is pretty new, it does not have many links pointing towards it in my targeted language, so those are probably not enough to convince Google to attach the right language to it, and Google ignores all the other signals about the page language. I'm also unsure whether I should use the "disavow" tool, or a reconsideration request, or what else to do about this miserable state. I have never used these tools before, so I don't have any experience with them. Somehow I have to convince Google about the right language of the page, and also not to count all those historical links from the previous owner. (The domain had been deleted without any traces in Google before I registered it.) Has anyone here ever dealt with a similar "Translate this page" problem? (I've also looked at this thread: How can I prevent Google mistakenly offering to translate a page? but didn't find a solution there.)

    Read the article

  • How does URL Rewriting affect SEO?

    - by Costa
    The following paragraph is from Google's SEO guide: "Google is good at crawling all types of URL structures, even if they're quite complex, but spending the time to make your URLs as simple as possible for both users and search engines can help. Some webmasters try to achieve this by rewriting their dynamic URLs to static ones; while Google is fine with this, we'd like to note that this is an advanced procedure and if done incorrectly, could cause crawling issues with your site." What makes a URL-rewriting implementation incorrect for Googlebot? I am using the ASP.NET 3.5 framework.

    Read the article

  • Best in-depth analytics or stats tools? (preferably server-side)

    - by Litso
    Hey all, I know there have been questions about this before, but mine is a little more specific. I work for a high-traffic website and we want to start tracking our visitors better. Unfortunately, Google Analytics is not an option at the moment, so I'm looking for alternatives, preferably server-side (but not necessarily). We're currently running Urchin, but what I miss most there is the way you can set up conversions in Analytics and then track (for example) which keywords or which landing pages convert better. A/B testing is also something I really miss. Which analytics tools are comparable to Google Analytics in terms of advanced segmentation, navigation summaries, A/B testing, etc.?

    Read the article

  • duplicate pages

    - by Mert
    I made a small coding mistake and Google indexed my site wrongly. This is the correct form: https://www.foo.com/urunler/171/TENGA-CUP-DOUBLE-HOLE but Google indexed my site like this: https://www.foo.com/urunler/171/cart.aspx First I fixed the problem, then made a sitemap containing only the correct links. Now I've checked Webmaster Tools and I see this: Total indexed 513, Not selected 544, Blocked by robots 0. So I think the "not selected" count is caused by the duplicate indexed URLs. I want to know how to fix the "https://www.foo.com/urunler/171/cart.aspx" links. Should I fix it in code, or contact Google to reindex my site? If I should redirect the wrong/duplicate links to the correct ones, what is the right way to do it? Thanks for your time in advance.

    Read the article

  • Which Adult Ads Service is best / highest paying [closed]

    - by shamittomar
    I have a sex education & sexual health website. Obviously, I cannot place Google AdSense or Adbrite advertisements, as they disallow mature content and even remotely related material. Now, I want to know what other options I have for showing ads. I do NOT want to place very obscene or nude ads, but I would like to have some kind of ads on the website to make it sustainable. So, what options do I have? Which adult advertisement publisher gives the highest payouts?

    Read the article

  • How do I migrate web files from a Plesk 8 installation (on a slaved HDD) to a Plesk 10.4.4 installation?

    - by Ranger
    With Plesk 8 at End of Support, our host set up a new installation of RHEL and Plesk 10 on a new hard drive. They then slaved the old drive to the new one so that we could migrate all our files over SSH. I am having trouble migrating the subdomain files correctly. The path to a subdomain's root folder in Plesk 10.4.4 is confusing, and I don't know where to copy the files to. The path to the files on the slaved drive is "/mnt/old-drive/var/www/vhosts/domain_name.com/subdomains/SUBDOMAIN_NAME/", while on the new installation I have "/var/www/vhosts/SUBDOMAIN_NAME.domain_name.com". There is an httpdocs folder in the '/var/www/vhosts/domain_name' folder, but none in the '/var/www/vhosts/SUBDOMAIN_NAME.domain_name.com' folder. Where do I copy my subdomain files to? Please help.

    Read the article

  • Do I need multiple accounts in Facebook for each of my product sites?

    - by John
    I have a dozen sites, including for-profit ones as well as charities. For each site I've created a Facebook company/charity account. After creating those accounts it dawned on me that I could just as well have created a new page for each of my sites from my personal account alone, even when a site has multiple product pages. What is the right strategy? Also, as per Facebook's terms, we can have only a single personal account. I do have a single personal account, and for each site I've only created company pages. I hope I'm not violating the Facebook terms.

    Read the article

  • Pulling My Hair Out - PHP Forms [migrated]

    - by Joe Turner
    Hello and good morning to all. This is my second post on this subject because the first time things still didn't work, and I have now literally been trying to solve this for about 4 or 5 days straight... I have a file called 'edit.php', and in this file is a form:

        <?php
        $company = $_POST["company"];
        $phone = $_POST["phone"];
        $colour = $_POST["colour"];
        $email = $_POST["email"];
        $website = $_POST["website"];
        $video = $_POST["video"];
        $image = $_POST["image"];
        $extension = $_POST["extension"];
        ?>
        <form method="post" action="generate.php"><br>
            <input type="text" name="company" placeholder="Company Name" /><br>
            <input type="text" name="slogan" placeholder="Slogan" /><br>
            <input class="color {required:false}" name="colour" placeholder="Company Colour"><br>
            <input type="text" name="phone" placeholder="Phone Number" /><br>
            <input type="text" name="email" placeholder="Email Address" /><br>
            <input type="text" name="website" placeholder="Full Website - Include http://" /><br>
            <input type="text" name="video" placeholder="Video URL" /><br>
            <input type="submit" value="Generate QuickLinks" style="background:url(images/submit.png) repeat-x; color:#FFF"/>
        </form>

    Then, when the form is submitted, it creates a file using the variables that have been input. The fields that have been filled in go on to become links. I need to be able to say "if a field is left blank, then put 'XXX' in as a default value". Does anyone have any ideas? I really think I have tried everything. Below is a snippet from the .php file that generates the links:

        <?php
        $File = "includes/details.php";
        $Handle = fopen($File, 'w');
        ?>
        <?php
        $File = "includes/details.php";
        $Handle = fopen($File, 'w');
        $Data = "<div id='logo'>
            <img width='270px' src='images/logo.png'/img>
            <h1 style='color:#$_POST[colour]'>$_POST[company]</h1>
            <h2>$_POST[slogan]</h2>
            </div>
            <ul>
            <li><a class='full-width button' href='tel:$_POST[phone]'>Phone Us</a></li>
            <li><a class='full-width button' href='mailto:$_POST[email]'>Email Us</a></li>
            <li><a class='full-width button' href='$_POST[website]'>View Full Website</a></li>
            <li><a class='full-width button' href='$_POST[video]'>Watch Us</a></li>
            </ul> \n";

    I really do look forward to any response...
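
    A minimal sketch of the "default to 'XXX' when blank" requirement: read each POST value through a small helper that falls back to 'XXX' when the field is missing or empty, and build the generated markup from those values instead of raw $_POST data. The field names are taken from the form above; the helper function itself is made up for illustration.

        <?php
        // Hypothetical helper: return the submitted value, or 'XXX' when the field is blank.
        function post_or_default($field, $default = 'XXX')
        {
            if (isset($_POST[$field]) && trim($_POST[$field]) !== '') {
                return $_POST[$field];
            }
            return $default;
        }

        $company = post_or_default('company');
        $phone   = post_or_default('phone');
        $email   = post_or_default('email');
        $website = post_or_default('website');
        $video   = post_or_default('video');

        // The generated markup then uses the defaulted values rather than $_POST directly.
        $Data = "<li><a class='full-width button' href='tel:$phone'>Phone Us</a></li>\n";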

    Read the article

  • URL parameter names being changed by user agents

    - by Mike Deck
    In reviewing one of our site's web logs, I'm seeing instances where we return a 404 because we expect an id parameter but instead receive a di parameter. The resource in question is an image, and which image file actually gets served depends on the id parameter. The expected URL is something like http://images.mysite.com/photo.gif?id=123&width=200&height=300 What I'm seeing in the logs is requests for http://images.mysite.com/photo.gif?di=123&width=200&height=300 The only parameter this happens to is id. It seems unlikely that this is due to a server-side or JavaScript bug, since it only affects a small percentage of our traffic, and we are seeing it across a wide variety of user agents (both mobile and desktop) and IPs. Has anyone else seen this? Is there a browser plugin or other software you're aware of that could be causing this, and if so, is there a good way to work around the issue?
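
    If a workaround is acceptable, one option is simply to treat di as an alias for id on the server before looking the image up. A small sketch of the idea (it assumes the image script is PHP, which the question doesn't state):

        <?php
        // Tolerate the mangled parameter name: fall back to 'di' when 'id' is absent.
        if (!isset($_GET['id']) && isset($_GET['di'])) {
            $_GET['id'] = $_GET['di'];
        }

        $id     = isset($_GET['id'])     ? (int) $_GET['id']     : 0;
        $width  = isset($_GET['width'])  ? (int) $_GET['width']  : 200;
        $height = isset($_GET['height']) ? (int) $_GET['height'] : 300;

        if ($id === 0) {
            header('HTTP/1.1 404 Not Found'); // still 404 when neither parameter is present
            exit;
        }
        // ...look up and serve the image for $id as before...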

    Read the article

  • Redirecting 2 or more domains to the same hosting server

    - by mtk
    I have the domains A.com, A.co.in and A.in, purchased from site X. I have a hosting space/account purchased from site Y, which provided me with 2 DNS entries that are to be set in the account at the site where I purchased the domains. I have successfully changed the DNS entries of A.com to these 2 entries, and I can see my index.html page when I hit A.com. Problem: along the same lines, I changed the DNS entries of A.co.in and A.in to the same values, but hitting those sites in a browser gives me no response, and the browser's 'Site not found' page is shown. Please let me know how to set this up so that the website is served from the hosting server whichever domain I hit. What am I doing wrong here? Note: it has been more than 3 days since I changed the DNS entries, so I don't think this is a DNS propagation problem, which I heard about from some people. Please give a detailed explanation, as I am very, very new to this. This is my first hosting ;) -Thanks

    Read the article

  • How do I access column data in a previous select statement from a sub-query? [closed]

    - by payling
    PROBLEM: How do I access column data in a previous select statement from a sub-query? Below is a simple mock-up of what I'm attempting to do. Tables used: Quotes, Users. The quotes table has the columns qid (quote id), owner_uid, creator_uid.

        SELECT q.qid, q.owner_uid, q.creator_uid, owner.fname, owner.lname
        FROM quotes q,
             (SELECT u.fname, u.lname
              FROM users u
              WHERE u.uid = q.owner_uid) AS owner
        WHERE q.qid = '#'

    SUMMARY: I want to be able to use the quote table's owner_uid and specify it for the owner table, so I can return all the owner info for that particular quote. The problem is that q.owner_uid is not recognized in the owner sub-query. What am I doing wrong?

    Read the article

  • How can I choose between Linux and Windows hosting? [closed]

    - by Mohamad
    Possible Duplicate: How to find web hosting that meets my requirements? I am a relative beginner when it comes to choosing web servers and hosting plans. I'm about to sign up for a hosting plan with GoDaddy. My main requirements are ColdFusion and MySQL. The plans on offer include Linux- and Windows-based plans. Which one should I choose, and why? I don't have many requirements other than what I mentioned above. I have never used Linux before, but I doubt I'll ever need to do anything beyond tinkering with my account. What are the main advantages of one over the other?

    Read the article

  • Working with Primary Keys and Generators - Quickstart with NHibernate (Part 4)

    - by BobPalmer
    In this NHibernate tutorial, I'll be digging into the ID tag and Generator classes.  I had originally planned on finishing up a series on relationships (parent/child, etc.) but felt this would be an interesting topic for folks, and I also wanted to start integrating some of the current NHibernate reference. Since this article also includes some reference sections (and since I have not had a chance to check for every possible parameter value), I used the current reference as a baseline, and would welcome any feedback or technical updates that I can incorporate. You can find the entire article up on Google Docs at this link: http://docs.google.com/Doc?id=dg3z7qxv_24f3ch2rf7 As always, feedback, suggestions, and technical corrections are greatly appreciated! Enjoy! - Bob

    Read the article

  • would it be bad to put <span> tags within the <head>, for grouping meta data in schema.org format?

    - by hdavis84
    Alright, I'm currently practicing schema.org microdata and trying to find the best approach for every site I build. I have found that I can piggyback itemprop attributes on Open Graph meta tags, and I would like to piggyback more itemprops on them. However, schema.org requires you to change itemtypes to define all aspects of a "thing". Say I'm defining a LocalBusiness. Open Graph has street address, locality, and region properties I'd like to piggyback on. I'd have to do something like:

        <html lang="en" itemscope itemtype="http://schema.org/LocalBusiness">
        <head>
            ...
            <meta itemprop="name" content="Business Name" />
            <meta property="og:url" itemprop="url" content="http://example.com" />
            <meta property="og:image" itemprop="image" content="http://example.com/logo.png" />
            <span itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
                <meta property="og:street-address" itemprop="streetAddress" content="1234 Amazing Rd." />
                <meta property="og:locality" itemprop="addressLocality" content="Greenfield" />
                <meta property="og:region" itemprop="addressRegion" content="IN" />
            </span>
        </head>

    Although there's more that could be added, this is enough of an example to show what I'm trying to achieve. I've searched the web to see whether it is a problem to use spans in the head, because I don't want invalid markup. I know I can mark up the address information in the body of the pages, but the route above would be more efficient. Does anyone have an answer for this?

    Read the article

  • What could have caused a large traffic drop from Google in early May?

    - by Scott Schluer
    I have a website (www.equispot.com) that has been indexed for almost 2 years in Google. I managed to get myself on the first page (average position 6-8) on Google for my target keyword of "horses for sale" and held there pretty solidly for months. Suddenly, with no changes to the site, traffic from Google dropped like a rock in early May. I slowly fell in position until now I'm sitting at the bottom of page 4. I have never hired an SEO firm, have not used any "black hat" techniques that Google would have penalized me for in their May update, etc. I'm not familiar enough with SEO to know how to look at link profiles, etc. to tell if there's something wrong. I've run my site through a DNS checker and it came back with no errors. Google Webmaster Tools shows no messages or notices of any kind, just a drop in traffic. GWT also shows only 2 server errors and 1 404. Is there anyone who can tell me by quickly checking my domain if there's an obvious reason that my traffic would have fallen so far, something that I can fix?

    Read the article

  • Website Registration Date

    - by Matt Walker
    I recently registered a website: cinematrailers.net. I was aware that this domain had expired (it was registered in 2007 and held until 2011). My question: when I view the registration date for this domain, it says 2012 instead of 2007. Why is that, when it was clearly first registered in 2007? And will this affect SEO? I was under the assumption that the domain's age would be great for SEO, but now I'm not sure.

    Read the article
