Search Results

Search found 13195 results on 528 pages for 'technical trainer pro'.

Page 233/528 | < Previous Page | 229 230 231 232 233 234 235 236 237 238 239 240  | Next Page >

  • Compared to Firefox 4 and Google Chrome 10, what can't IE9 do?

    - by ClosureCowboy
    If a website works in Firefox 4 and in Google Chrome 10, what could potentially cause that website not to work (broken layout or broken JavaScript) in IE9? What limitations and differences does IE9 have, aside from vendor-specific stylesheet rules? Yes, that is a painfully vague question — that's because I am not asking this question from the perspective of someone with a specific problem! I'm asking this question from the perspective of someone with a working website who does not have access to IE9.

    Read the article

  • Blog: Search results prefer index page over content pages.

    - by jonescb
    I have a typical blog that has recent posts on the main page, and each post's title links to a page that only shows that one article and comments and such. I was looking through some of the keywords used to get to my site and I was noticing that some of the searches would only show my main page, and not the page for the article. If users have to find the article by scrolling through the main page, it just makes it more difficult. Is there some way that I can tell search engines to rank the content page higher than my index page? Or can I do something else like not display the full text of the posts on the main page?
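
    One practical option for that last point, assuming the blog runs WordPress (the question only says "typical blog", so this is a sketch, not a prescription), is to print excerpts rather than full posts in the index loop, so only the single-post pages carry the full article text:

        <?php
        // index.php loop sketch (hypothetical theme): show excerpts on the
        // main page so the per-article pages hold the unique full content.
        if (have_posts()) {
            while (have_posts()) {
                the_post();
                the_title('<h2><a href="' . esc_url(get_permalink()) . '">', '</a></h2>');
                the_excerpt();   // trimmed summary instead of the_content()
            }
        }
        ?>

    With unique full text living only on the article pages, search engines have a stronger reason to surface those pages instead of the index.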

    Read the article

  • Hosting and domain registrations for multiple clients

    - by letseatfood
    I am finally getting regular work designing, developing, and deploying websites for small businesses and individuals. So far the websites utilize single-user content management systems, so they create, as far as I know, minimal load on the shared servers. I have always required that each of my clients purchase annual shared hosting at Dreamhost. For domain registration, I ask that they register with Dreamhost, but some already have a domain registered elsewhere, and this is fine with me. I do this so the billing issues are the client's responsibility, not mine. My question is: since I can register unlimited domains and connect them to my one shared hosting account at Dreamhost, should I not be requiring clients to individually pay for shared hosting and a domain? Should I actually be paying for one hosting account and then hosting all of my clients' websites on that account? As I said before, I currently have each client buy their own hosting because I feel that, for example, if there is high traffic to their site, there would be less of a chance of the site going down than if their site were hosted with many others on one account. I am famous for being long-winded, please let me know if I can clarify at all. Thanks!

    Read the article

  • Google Analytics - visitor path to specific site destination setup and monitoring?

    - by Joshc
    I have a website where I am using Google Analytics to track visitors and our banner campaigns. We are promoting 'Purchase Ticket' buttons on our website which push visitors to a third-party website that sells and distributes our tickets. The URL on all the 'Purchase Ticket' buttons is the same throughout the site, for example: http://ticketmaestro.com/events/my-event-2012 In the Analytics control panel, is it possible to set something up where I create a path-to-destination using the above example URL? And once this is set up, I want to be able to monitor the path visitors take from when they reach the site to when they click the 'Purchase Ticket' button. Graphs would show: start destination, the path to the final destination, and the final destination (http://ticketmaestro.com/events/my-event-2012). Any help, suggestions, or terminology would be great, thanks. Josh
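
    Because the 'Purchase Ticket' URL is on a third-party domain, one common approach (a sketch, not necessarily how this particular site is tagged) is to record an event when the button is clicked and then base a goal on that event. Assuming the classic asynchronous tracker (_gaq) that was current at the time, the button markup could look like this (the category, action, and label strings are placeholders):

        <!-- Hypothetical button markup: records an event before the visitor
             leaves for the third-party ticket site. -->
        <a href="http://ticketmaestro.com/events/my-event-2012"
           onclick="_gaq.push(['_trackEvent', 'Tickets', 'Purchase Click', location.pathname]);">
          Purchase Ticket
        </a>

    With the event in place, the Events reports (and a goal built on the event) show which pages visitors were on when they clicked through, and the navigation path to those pages is visible in the flow/navigation reports.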

    Read the article

  • Is there a way I can sort traffic by page type based upon URL structure in Google Analytics or Google Webmaster Tools?

    - by Felix
    I have a local business directory site. I'm trying to segment my incoming traffic by page type so that I can find out what percentage of traffic goes exclusively to zip-code pages and what percentage goes to city/state-level pages. I basically want to filter by URL structure to find out what percentage of total traffic the zip-code pages account for. (The reason for doing this is to find out if…) Does Google Tag Manager help with this? Here are the two URL paths: http://www.example.com/ny/new-york/10011/ and http://www.example.com/ny/new-york Thanks all!
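
    One way to split these in Analytics, without any extra tagging, is an advanced segment (or custom report filter) that matches the Page dimension against a regular expression. A sketch, assuming the two URL patterns above are representative of the whole site:

        # Hypothetical "Page" regular expressions for two segments:
        # zip-code pages  -> /state/city/5-digit-zip
        ^/[a-z]{2}/[^/]+/[0-9]{5}/?$
        # city/state pages -> /state/city with no zip segment
        ^/[a-z]{2}/[^/]+/?$

    Comparing the visit counts of the two segments against total traffic gives the percentages; Tag Manager isn't required for this.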

    Read the article

  • Google Analytics - bad experiences? (esp. adult content)

    - by Litso
    I work for a rather large adult website, and we're currently not using Google Analytics. There is an internal debate going on about whether we should start using Analytics, but there is hesitation from certain parties. The main argument is that they fear Google will get too much insight into our website and might even block us from the index as a result, based on our adult content. Has anyone here ever had such an experience, or does anyone know of stories about bad experiences with Google Analytics in this regard? I personally think it would only improve our website if we were able to use Analytics, but the dev team was asked to look into possible negative effects. Any help would be appreciated.

    Read the article

  • I am building a simple website for my mobile app & need a good recommendation on where to host it [closed]

    - by Gob00st
    Possible Duplicate: How to find web hosting that meets my requirements? Question 1: I am building a simple website for my mobile app and need a good recommendation on where to host it. I am not expecting a large access volume any time soon, but I want stability in general, and considering I am just starting on my first app, it probably needs to be relatively cheap. Please recommend some stable and cost-effective web hosting services. Question 2: Also, since I am somewhat new to web development (I know basic HTML and used FrontPage/Dreamweaver about 10 years ago, but haven't touched them for ages) but am a good C++ software developer, how do you recommend I build a simple static website (maybe just a few pages) for my mobile app? Any template or tool recommendations? Thanks a lot.
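
    For the second question, a static site really can be just a handful of hand-written pages. A minimal sketch of one page (the title, file names, and content are placeholders):

        <!DOCTYPE html>
        <html>
        <head>
          <meta charset="utf-8">
          <title>My App</title>
          <link rel="stylesheet" href="style.css">
        </head>
        <body>
          <h1>My App</h1>
          <p>A short description of the app, screenshots, and a download link.</p>
        </body>
        </html>

    Any text editor is enough for a few pages like this, and a free template gallery can supply the stylesheet if design isn't your focus.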

    Read the article

  • How to use multiple search keys?

    - by user32565
    I have a database in which the files are named abcd100.00b, abcd101.00b, etc. I need code where, when the user enters abcd and, separately, the range 100 to 110, all the files named abcd whose number falls in the range 100 to 110 get displayed. Right now the following code can only search on the first four characters. How do I implement this?

        <?php
        // Capture the search term and remove spaces at both ends, if any.
        $searchTerm = trim($_GET['keyname']);

        // Check whether a search term was actually supplied.
        if ($searchTerm == '') {
            echo "Enter the name you are searching for.";
            exit();
        }

        // Database connection info.
        $host = "localhost"; // server
        $db   = "rinex";     // database name
        $user = "m";         // database user name
        $pwd  = "c";         // password

        // Connect to the server and select the database.
        $link = mysqli_connect($host, $user, $pwd, $db);

        // Escape the user input before putting it into the query string.
        $searchTerm = mysqli_real_escape_string($link, $searchTerm);

        // MySQL search statement.
        $query   = "SELECT * FROM rinexo WHERE rinex_file LIKE '%$searchTerm%'";
        $results = mysqli_query($link, $query);

        // Check whether there were matching records in the table
        // by counting the number of rows returned.
        if (mysqli_num_rows($results) >= 1) {
            echo '<table border="1">
                    <tr>
                      <th>rinex version</th>
                      <th>program</th>
                      <th>date</th>
                      <th>marker name</th>
                      <th>marker number</th>
                      <th>observer</th>
                      <th>agency</th>
                      <th>position_X_Y_Z</th>
                    </tr>';
            while ($row = mysqli_fetch_array($results)) {
                echo '<tr>
                        <td>' . $row['rinex_version'] . '</td>
                        <td>' . $row['pgm'] . '</td>
                        <td>' . $row['date'] . '</td>
                        <td>' . $row['marker_name'] . '</td>
                        <td>' . $row['marker_no'] . '</td>
                        <td>' . $row['observer'] . '</td>
                        <td>' . $row['agency'] . '</td>
                        <td>' . $row['position_X_Y_Z'] . '</td>
                      </tr>';
            }
            echo '</table>';
        } else {
            echo "There was no matching record for the name " . $searchTerm;
        }
        ?>
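
    To actually combine the two search keys (a name prefix plus a numeric range), one approach is to compare the numeric part of the file name with BETWEEN in the SQL, using a prepared statement. This is a sketch only: the extra form fields (from, to) are assumptions, the column and table names follow the code above, and it assumes the naming convention really is prefix followed by a number (MySQL's CAST reads the leading digits of a string like "100.00b" and ignores the trailing letter, with a truncation warning):

        <?php
        // Hypothetical extra inputs: the range boundaries, e.g. 100 and 110.
        $from = (int) $_GET['from'];
        $to   = (int) $_GET['to'];

        // rinex_file values look like "abcd100.00b": strip the prefix,
        // cast the rest to a number, and require it to fall in the range.
        $query = "SELECT * FROM rinexo
                  WHERE rinex_file LIKE CONCAT(?, '%')
                    AND CAST(SUBSTRING(rinex_file, LENGTH(?) + 1) AS DECIMAL(10,2))
                        BETWEEN ? AND ?";

        $stmt = mysqli_prepare($link, $query);
        mysqli_stmt_bind_param($stmt, 'ssii', $searchTerm, $searchTerm, $from, $to);
        mysqli_stmt_execute($stmt);
        $results = mysqli_stmt_get_result($stmt);   // requires the mysqlnd driver

        while ($row = mysqli_fetch_assoc($results)) {
            // Render the row as in the table above.
        }
        ?>

    The prepared statement also removes the need to escape the search term by hand.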

    Read the article

  • Domain mapping issues

    - by Nadya
    I have two domain names, .com and .co.uk, bought with 123-reg, and just one student Windows hosting pack associated with the .co.uk domain. The .com domain is the main one people will be trying to access, so I just mapped that domain to the hosting this morning. The problem is that I would really like it to be functional by tomorrow morning, and the usual waiting time is 24-48 hours. Is there any point in stopping the process and trying to forward it with a CNAME record instead, and does that take less time? (I can go back and do proper domain mapping during the weekend.) Also, is there a way to check whether the domain mapping has been done correctly before these 24-48 hours are up? From some computers I get a 404 error on the homepage.
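
    For checking the mapping before DNS has propagated, one common trick (a sketch; the domain and IP below are placeholders, substitute the .com domain and the hosting account's actual IP) is to point the domain at the server locally via the hosts file, which bypasses DNS entirely:

        # /etc/hosts (Linux/Mac) or C:\Windows\System32\drivers\etc\hosts (Windows)
        # Placeholder IP: replace with the hosting account's real IP address.
        203.0.113.10    example.com www.example.com

    If the site loads correctly with that entry in place (and shows the 404 again once it is removed), the mapping on the hosting side is fine and the remaining delay is just propagation.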

    Read the article

  • Google indexed the same page under two URLs (despite rel-canonical)

    - by unor
    The Super User question "Playing mp3 in quodlibet displays “GStreamer output pipeline could not be initialized” error" is indexed under two URLs in Google:

        http://superuser.com/questions/651591/playing-mp3-in-quodlibet-displays-gstreamer-output-pipeline-could-not-be-initia
        http://superuser.com/questions/651591/playing-mp3-in-quodlibet-displays-gstreamer-output-pipeline-could-not-be-initia/652058

    The first one is the canonical one; the corresponding rel-canonical is included in both pages:

        <link rel="canonical" href="http://superuser.com/questions/651591/playing-mp3-in-quodlibet-displays-gstreamer-output-pipeline-could-not-be-initia" />

    Google also indexed http://superuser.com/a/652058, which redirects to the answer:

        http://superuser.com/questions/651591/playing-mp3-in-quodlibet-displays-gstreamer-output-pipeline-could-not-be-initia/652058#652058

    Now, the second URL from above is the same as this one minus the fragment #652058. So Google seems to strip the fragment, which results in exactly the same page under another URL (containing the answer ID /652058 as suffix), and indexes it too, despite rel-canonical and duplicate content. Shouldn't Google recognize this and only index the canonical variant? And what could be the reason why Stack Exchange includes the answer ID in the URL path, and not only in the fragment (resulting in various URL variants for the same page)?

    Read the article

  • Will main domains be more SEO friendly than subdomains?

    - by C graphics
    Web hosting providers offer services such as hosting multiple domains in one account, so my concern is about SEO friendliness. Say the main domain of my account is maindomain.com, on which I have added an addon domain, say domain2.com. That means cPanel will generate domain2.maindomain.com, and the contents of domain2.com will practically be stored in a subfolder of maindomain.com. Now, assume both maindomain.com and domain2.com have the same structure, both optimized for SEO in the same way. My question is: would maindomain.com links be more SEO friendly due to the fact that maindomain.com is the main domain of my account?

    Read the article

  • Making a language switch main menu button in Drupal

    - by Let_Me_Be
    I have a bilingual site in Drupal. The problem is that I hate the language switch block taking up so much space (sometimes the only thing in the sidebar is the language switch block). So what I would love to have is a language switch menu item that points to the other language (the one other than the current language). Something like this: | Home | Projects | BlaBla | | Cesky | and after the switch: | Domu | Projekty | Blabla | | English | Is that possible without writing a whole new module?
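
    It may not be possible entirely without code, but it can stay very small. A sketch of a custom block body (assuming Drupal 7 with the PHP filter enabled, and assuming the two languages are English and Czech; the language codes are placeholders) that prints a single link to the current page in the other language, which the theme can place as the last menu item:

        <?php
        // Hypothetical custom block: one link to the other language.
        global $language;

        $languages = language_list();   // all enabled languages, keyed by code
        $other = ($language->language == 'en') ? $languages['cs'] : $languages['en'];

        // Link to the same path, switched to the other language.
        print l($other->native, current_path(), array('language' => $other));
        ?>

    Depending on the setup, the i18n (Internationalization) package's menu translation options may achieve the same effect with configuration alone.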

    Read the article

  • Creating sites with local IPs that point to a distant server

    - by fatnjazzy
    Hi. We are a company distributed across several places in Europe (real offices). Each office has its own domain: company.de, company.co.uk, company.ch, and so on. Our website servers are located in one place, and we can't distribute our site to different locations. How can we create a local IP in each location that points to our main server, so that Google sees us as having a local IP? Explanation: Google has decided to increase your PR if you have a local IP; the thinking is that if you bought a server in the local market, it means you are very serious about your business. We have 8 employees in each office and we can't have a separate server in each one. Does that mean we are not serious about our business? No, and this is why I need to create this illusion. Thanks

    Read the article

  • What's the current wait time for reconsideration requests for Google's webmaster tools?

    - by chrism2671
    We recently received an unnatural links penalty on our site; a rogue SEO firm did us some serious damage, and we lost 40% of our traffic (hundreds of thousands of users) overnight. The effect on our business has been severe and we're really hoping we've made things right. We submitted a reconsideration request, but I'm wondering how long I should forecast for an outcome, as it will have a knock-on effect on our business.

    Read the article

  • Google Sitemap Limits?

    - by Anonymous -
    I've read in multiple places that Google's sitemap limit sits at 50,000 URLs per sitemap, though it's my understanding that you can submit multiple sitemaps to overcome this problem. I've also found that Google follows the sitemap protocol found here. My question is: is there anywhere where Google directly comments on the specifications and limits of the sitemaps they accept? All the information I've found isn't on any Google domain.
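
    For the 50,000-URL case specifically, the sitemaps.org protocol that Google references defines a sitemap index file that lists the individual sitemaps. A minimal sketch (the file names and domain are placeholders):

        <?xml version="1.0" encoding="UTF-8"?>
        <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <sitemap>
            <loc>http://www.example.com/sitemap-1.xml</loc>
          </sitemap>
          <sitemap>
            <loc>http://www.example.com/sitemap-2.xml</loc>
          </sitemap>
        </sitemapindex>

    The index file is what gets submitted in Webmaster Tools, and each referenced sitemap stays under the per-file limit.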

    Read the article

  • Google map in MediaWiki not showing

    - by user67656
    I have upgraded MediaWiki from 1.9.3 to 1.16.1 on a new server. However, the Google map is not showing; the page where it should appear is blank, but on the old server with the old version it works fine. I am not a developer, so I have no clue about this. Please let me know if anybody has any idea. You can have a look at the links below: http://new.realchicago.org/wiki/index.php/Archer_Heights The first link is the one in which the Google map is missing.

    Read the article

  • Redirection & SEO related stuff while moving to a new blog

    - by Karshim Kanwar
    I have a WordPress blog and recently I have set up a new blog; let's call the old one "blog old" and the new one "blog new". What I did is move the content, photos, pictures, and all 250 posts from blog old to blog new. Both blog names are different, as they point to different domain names! I read helpful things on this site itself, here. I will no longer use blog old; moreover, I am concerned about the SEO of blog new. Blog new is fairly new (just 24 hours old, and no pages have been indexed in Google). I have done the following: deleted all the posts shared on the Facebook fan page, Twitter profile, and Google+ page, and finally deleted the fan page, Twitter, and Google+ page; edited the backlinks to the old blog within blog new. The questions I have are: How do I prevent duplicate content issues? Do I go straight away and delete all the posts on blog old? Should I start sharing the blog posts from blog new? Should I submit the new site to Webmaster Tools or wait a few weeks? Every comment here is appreciated! What issues can I face relating to SEO?
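
    On the duplicate-content point, the usual approach is to 301-redirect every old URL to its counterpart on the new domain rather than delete the old posts outright, so existing links and already-indexed pages carry over. A sketch for blog old, assuming it runs on Apache, the permalink structure is unchanged, and the domain names below are placeholders:

        # .htaccess on blog old: send every request to the same path on blog new.
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?old-blog\.example$ [NC]
        RewriteRule ^(.*)$ http://new-blog.example/$1 [R=301,L]

    With redirects like this in place, there is generally no need to rush deleting blog old, and submitting the new site to Webmaster Tools straight away does no harm.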

    Read the article

  • Is there a (free) reliable place to get statistics from sites, more reliable than Alexa, Quantcast, Compete?

    - by S.gfx
    I mean, it seems there's no way; I am just asking in case someone knows of a recent new site that is more accurate. I am aware of Alexa's, Compete's, and Quantcast's inaccuracies and/or their limited system/range of sites for gathering stats. I also know about websitegrader perhaps being a little more accurate (although I'm not sure that's the data I am after), and I've read that SEOmoz tools are reliable. I am still, though, looking for a free solution, a 'reliable' Alexa: not a service depending on a toolbar installation, not one that's easy to trick, not one with stats that are way off, and not one covering only a very limited range of sites. I am almost sure there's nothing new, but I wanted to be sure.

    Read the article

  • Can Adwords be cancelled by Google because of improper IE6 site rendering

    - by user745434
    A client just got a notice from Google saying that their AdWords campaign has been put on hold because the site is: improperly rendering, or under construction, or needs a special program to run. Now, the site is rendering improperly on IE6; on everything else, including IE7+, it's fine. If this is the issue, would putting up a "Looks like you're using an older browser" message instead of the site for IE6 be a solution? Or must the site look good in IE6 for the AdWords campaign to continue?
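
    If an "older browser" notice turns out to be acceptable, the era-appropriate way to show it only to IE6 is a conditional comment; a sketch (the wording and class name are placeholders):

        <!--[if lt IE 7]>
          <div class="old-browser-notice">
            It looks like you're using an older browser. Please upgrade to
            Internet Explorer 7 or later, Firefox, or Chrome to view this site.
          </div>
        <![endif]-->

    Whether that satisfies the AdWords destination-site requirements is really a question for Google support, though, since the policy wording quoted above is about the landing page working at all.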

    Read the article

  • Working with Google Webmaster Tools

    - by com
    My first question is about Crawl errors in Google Webmaster Tools. Crawl errors is divided into a few sections, one of them being HTTP. I assume that all the broken links under HTTP were somehow found by the crawler and are not the links from the sitemap. If they were found by scanning all sitemap pages for links, why doesn't it mention what the source page was, like the Linked From column in the sitemap section? And what is the meaning of Linked From? I thought that if the name of the section is Sitemaps, all URLs should be taken from the sitemap, so why is there a Linked From at all? The second question: what is the best way to treat searching on the site? How come the search result pages are getting indexed? Because all the search result pages are getting indexed, I have too many pages in Linked From. What's the right practice? Question three: in order to improve response time in WMT, can I redirect all crawler requests to a designated free web server? Is this good practice? Question four: how should I treat the Google Analytics code (with the PageView and PageLoadTime parameters) in the case where a user requests a non-existing page; should I render the Google code or not? Right now I use the Google Analytics code on the common template page, so that every page, including non-existing pages with an error message, contains the Google Analytics code, and it seems like this has an influence on WMT.
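
    On the second question, the usual practice is to keep internal search result pages out of the index entirely, either with robots.txt or a noindex meta tag. A sketch, assuming the search results live under a /search path (the path is a placeholder):

        # robots.txt: keep crawlers out of internal search result pages.
        User-agent: *
        Disallow: /search

    A robots.txt rule stops crawling altogether, while a <meta name="robots" content="noindex"> on the search result template lets the pages be crawled but not indexed; either way, the Linked From noise from those pages should drop over time.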

    Read the article

  • Summary of usage policies for website integration of various social media networks?

    - by Dallas
    To cut to the chase... I look at Twitter's usage policy and see limitations on what can and can't be done with their logo. I also see examples of websites that use icons that have been integrated with the look and feel of their own site. Given Twitter's policy, for example, it would appear that legal conversations/agreements would need to take place to do this, especially on a commercial site. I believe it is perfectly acceptable to have a plain text button that simply has the word "Tweet" on it, that has the same functionality. My question is if anyone can provide online (or other) references that attempt to summarize what can and can't be done when integrating various social networks into your own work? The answer I will mark as the correct one will be the one which provides the best resource(s) giving the best summaries of what can and can't be done with specific logos/icons, with a secondary factor being that a variety of social networking sites are addressed in your answer. Before people point to specific questions, I am looking for a well-rounded approach that considers a breadth of networks and considerations. Background: I would like to incorporate social media icons and functionality, but would like to consider what type of modifications can be done without needing to involve lawyers. For example, can I bring in a standard Facebook logo, but incorporate my site color into the logo? Would the answer differ if I maintained their color, but add in a few pixels of another color to transition? I am not saying I want to do this, but rather using it as an example.

    Read the article

  • Joomla: Prevent certain user from visiting a certain page

    - by MrB
    I have an internal section in my Joomla only for "Registered" users. There they can edit preferences. Multiple users can access the same preferences (different stakeholders). Now I have "smithcorp" and "smithcorp_guest". They should both be able to view the same things, but "smithcorp_guest" shouldn't be able to edit the preferences. Is there a way to forbid "smithcorp_guest" access to this page? I have only seen access regulation via the user level, i.e. unregistered, registered, admin. Thanks, MrB

    Read the article

  • Creating a foreign-word learning site with memory techniques (Web 2.0)? Will it work?

    - by Michal P.
    I would like to earn a little money by realizing a good, simple project. My idea is to build a website for learning a language chosen by me (for users who know English) using mnemonics. Users would be encouraged to enter English words with their translation into the other language and to describe a way to remember the foreign word (an association link). Example: if I choose Spanish for people who know English well, it would look like this: every user would be encouraged to enter a way to remember a Spanish word of their choosing. So he/she would enter into the dictionary (my site database), e.g., the English word beach - playa (the Spanish word), and then describe a method to remember the Spanish word, e.g., "Imagine that you are on the beach and you play volleyball": we have the word play and recall playa (mnemonics). I would like to allow picture hotlinks and encourage fun or slightly shocking memory links, which is, in the art of memory, a good thing. I would choose a language that fills a niche in Google Search. The big question is whether I would just be wasting my time on it. (Maybe I need to find a way to prototype the idea first to check it?)

    Read the article

  • Google Analytics (not provided) for 55% of total traffic

    - by Neolisk
    I've been here and here to learn what (not provided) means. Now the question is whether what I am seeing in my Google Analytics stats for my website is considered normal (and whether I can/should do anything about it). Here are the statistics from one day, but other days are similar: 102 visits, 57 of which are from (not provided); that's over 55% unknown keywords. Is it normal to have it like that? Does Google plan to do anything about it? In other words, what's the outlook? In my understanding, with this approach, as people switch to HTTPS, Analytics will stop being useful. Please correct me if I am wrong in my assumptions.

    Read the article

  • SEO, IIS 7 and web.config in subfolder issue

    - by tesicg
    We have an ASP.NET application that has a sub-folder with .aspx pages and a separate web.config file in it. The .aspx pages in that sub-folder behave as a separate site. In the web.config file at application level, I set a rule that removes trailing slashes:

        <rewrite>
          <rules>
            <rule name="RemoveTrailingSlashRule1" stopProcessing="true">
              <match url="(.*)/$" />
              <conditions>
                <add input="{REQUEST_FILENAME}" matchType="IsDirectory" negate="true" />
                <add input="{REQUEST_FILENAME}" matchType="IsFile" negate="true" />
              </conditions>
              <action type="Redirect" redirectType="Permanent" url="{R:1}" />
            </rule>
          </rules>
        </rewrite>

    I expect this rule to propagate downward to the sub-folder as well. To access the site in the sub-folder we should type http://concert.local/elki/ and get it without the trailing slash, as http://concert.local/elki. But the trailing slash remains. The web.config file in the sub-folder looks as follows:

        <configuration>
          <system.webServer>
            <defaultDocument>
              <files>
                <add value="Sections.aspx" />
              </files>
            </defaultDocument>
          </system.webServer>
        </configuration>

    Read the article
