Search Results

Search found 13195 results on 528 pages for 'technical trainer pro'.


  • Changing hosts but keeping old emails [closed]

    - by LDaniel
    Possible Duplicate: Change host / keep emails OK, here's the situation... I'm trying to transfer my domain and email address to a new hosting service, and I would like to start using Google's domain apps for my email, etc. My email address is currently on a webmail-type platform, and when I move my domain and start using Google domain apps I would also like to keep the old emails and have them imported into the mailbox at Google. Note: the email address will be the same on both hosts, for example [email protected]. So it's keeping the same address between two different systems. I've been Googling how to do this for a while, and none of the migration options that come up appear in the Google inbox settings or the domain apps config settings. Any help is greatly appreciated.
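
    For reference, when both hosts expose IMAP, the usual approach is an IMAP-to-IMAP copy with a tool such as imapsync; Google Apps Gmail can also pull the old mailbox over POP3 from Settings > Accounts ("Check mail from other accounts") if the old host stays reachable during the transition. A minimal imapsync sketch (the hostname, address, and passwords below are placeholders, not values from the question):

        imapsync --host1 mail.oldhost.example --user1 user@example.com --password1 'OLDPASS' \
                 --host2 imap.gmail.com --ssl2 --user2 user@example.com --password2 'NEWPASS'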

    Read the article

  • How do I use Instagram to post to my business page but not my personal page?

    - by DEdesigns57
    I tried asking this question on the Facebook forums but got no answer; maybe someone here can help me with this. I have a Facebook account, and within that same account is the Facebook page for my business. The problem is that every time I try to use Instagram to upload a photo, it gets uploaded to my personal/primary Facebook page and not the business page, which is under the same email address and password. Instagram asks for this email address and password, but how can I have Instagram upload to my business page and not my personal page? Or, at the very least, how can I do both?

    Read the article

  • How should I structure a site with content dependent on visitor type (not user)?

    - by Pedr
    I have a website that displays different content depending on two selections made by a visitor: whether they are a teacher or a student, and their learning level (from 4 options). Everything is public, and they don't need to authenticate to access the content. Depending on their selection, different content is displayed across the whole site, other than a contact and an about page. The tone of the language changes depending on whether the visitor is a student or a teacher, and the materials available on each page change with the learning level; in all cases, however, the structure of the site is identical. Currently I'm using a cookie to store the visitor's selections and render the appropriate content, so I have a single set of URLs which display different content depending on the cookie, with one of the permutations as the default. I appreciate this is far from ideal, but what is the better option? Would I be better off using a distinguishing segment for each selection, for example: http://example.com/teacher/lv3/resources/activities http://example.com/teacher/lv4/resources/activities http://example.com/student/lv4/resources/activities etc.? What is the most sensible way to handle this situation?
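
    For illustration, a minimal sketch of the URL-segment approach (Python/Flask purely as an example; the route and template names are hypothetical):

        # Minimal sketch: visitor type and level as URL segments, one shared structure.
        from flask import Flask, abort, render_template

        app = Flask(__name__)

        ROLES = {"teacher", "student"}
        LEVELS = {"lv1", "lv2", "lv3", "lv4"}

        @app.route("/<role>/<level>/resources/activities")
        def activities(role, level):
            if role not in ROLES or level not in LEVELS:
                abort(404)  # unknown permutations should not resolve
            # Same page structure everywhere; tone and materials vary by role/level.
            return render_template("activities.html", role=role, level=level)

    Each permutation then gets its own crawlable, bookmarkable URL, which is the main advantage over the cookie approach.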

    Read the article

  • Premium Cpanel Accounts Give Away

    - by mamta
    I found Book My Cloud.com, which is offering free cPanel hosting accounts. To request one, contact their support team: hxxp://support.book my cloud.com/index.php?a=add and select the request type "Free Cpanel Hosting Related". They provide: instant activation, disk quota 10 GB, monthly bandwidth 300 GB, max FTP accounts 5, unlimited email accounts and email lists, max databases 500, max subdomains 500, max parked domains 100, max addon domains 1000, control panel cPanel 2012.

    Read the article

  • How do non-coders do simple local templating to avoid redundant HTML? [closed]

    - by Max Cantor
    I'm a web developer. When I start designing a site, I use a framework to handle templating for me, even if it's just Rack + Erubis. What do non-developers do? If you want to implement a site in HTML and CSS without a framework running on a web server, without frames, and without WYSIWYG tools like Dreamweaver, how do you avoid copy-and-pasting the HTML of your navigation (for example) on every single page you're writing? I feel stupid asking this because it seems like there must be an obvious answer, but for the life of me I can't think of one right now.
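
    One classic no-framework answer is server-side includes (SSI), which most shared hosts support. A minimal sketch, assuming Apache with SSI enabled and pages saved as .shtml (the include path is hypothetical):

        <!-- in each .shtml page: pull in the shared navigation -->
        <!--#include virtual="/includes/nav.html" -->

        # and, if SSI is not already enabled, in .htaccess:
        Options +Includes
        AddOutputFilter INCLUDES .shtml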

    Read the article

  • How an offline main domain can influence traffic on an active subdomain

    - by danie7L T
    The website(s) design is for a company active in 3 different areas. As an example, let's use the following structure: www.example.com [sub1.example.com] [sub2.example.com] [sub3.example.com] sub2.example.com and sub3.example.com are ready to go live, but www.example.com really isn't, and it sends a 503 HTTP error code. I would like to know whether this situation will affect the traffic and ranking of the subdomains that are ready to go live. Is it preferable to wait and go live together with the main domain? Or is there nothing to "fear", and one doesn't affect the other? Thank you
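
    For what it's worth, search engines generally treat a 503 as temporary, especially when it comes with a Retry-After header. A minimal .htaccess sketch for the main domain (assumes Apache with mod_rewrite and mod_headers; the maintenance page name is hypothetical):

        RewriteEngine On
        RewriteCond %{REQUEST_URI} !^/maintenance\.html$
        RewriteRule .* - [R=503,L]
        ErrorDocument 503 /maintenance.html
        Header always set Retry-After "86400"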

    Read the article

  • My Herokuapp is inaccessible from custom domain name

    - by picardo
    I have a Heroku app that is located at myapp.herokuapp.com. I have mapped a domain name to this app using A records. I followed the instructions on Heroku's website to the letter, and it worked for a few days. Now when I try to access the site from the custom domain, it times out! On Chrome, I am getting an "Oops! Google Chrome could not find that page!" message. I tried pinging the name as well, but I got this error: ping: cannot resolve yourhostname.org: Unknown host The app itself is working, and I don't see any error messages from Heroku, or from New Relic. What's going on here? I also tried running host, and this was the error message: ;; connection timed out; no servers could be reached
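
    A useful diagnostic is to query the record directly and then via a public resolver, to see whether the A record itself has gone stale or only some resolvers have (dig is standard on most systems; yourhostname.org is the placeholder from above):

        dig +short yourhostname.org            # what your local resolver returns
        dig +short yourhostname.org @8.8.8.8   # what Google's public DNS returns
        dig +short myapp.herokuapp.com         # the target the record should track

    If the herokuapp.com name resolves but the custom domain does not, the DNS record is the problem; note that Heroku's documentation recommends CNAME-style records for subdomains precisely because the underlying IPs can change.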

    Read the article

  • https (SSL) instead of http

    - by user1332729
    I am building myself a new website, and out of privacy and security concerns I am contemplating making it https-only. It will be mobile-friendly, using media queries, but I am concerned, especially for mobile users, about the increased bandwidth. How much will doing so increase my bandwidth use or slow load times? For pages where I'm not transferring sensitive information, should I leave external links (to a jQuery library, or a web font, for instance) as http? Simply put, I have read articles saying the entire web would be more secure if everything used SSL, but my actual knowledge of implementation is limited to payment gateways, log-in pages, and such. I apologize for the open-ended nature of the question, but anything, even just simple answers to the specific questions, is welcome.
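
    For the redirect side of an https-only setup, a minimal sketch assuming Apache:

        RewriteEngine On
        RewriteCond %{HTTPS} off
        RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

    For external assets such as a jQuery CDN or web fonts, protocol-relative URLs (e.g. src="//ajax.googleapis.com/...") make the asset follow the page's scheme, avoiding mixed-content warnings on the https pages.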

    Read the article

  • What is the optimum length for an HTML title tag in Unicode format?

    - by user1501256
    I have a website that generates its title tags dynamically, and the title tags are in Unicode. The titles are limited to 65 characters, but sometimes Google doesn't show the full title in the SERPs. I'd like to know the optimum title-tag length, in terms of SEO, for Unicode titles, and whether there is any difference between a Unicode and a non-Unicode title tag. And what about other search engines, such as Bing, Yahoo, and so on?

    Read the article

  • Why do Google search results include pages disallowed in robots.txt?

    - by Ilmari Karonen
    I have some pages on my site that I want to keep search engines away from, so I disallowed them in my robots.txt file like this:

        User-Agent: *
        Disallow: /email

    Yet I recently noticed that Google still sometimes returns links to those pages in their search results. Why does this happen, and how can I stop it? Background: Several years ago, I made a simple web site for a club a relative of mine was involved in. They wanted to have e-mail links on their pages, so, to try and keep those e-mail addresses from ending up on too many spam lists, instead of using direct mailto: links I made those links point to a simple redirector / address harvester trap script running on my own site. This script would return either a 301 redirect to the actual mailto: URL, or, if it detected a suspicious access pattern, a page containing lots of random fake e-mail addresses and links to more such pages. To keep legitimate search bots away from the trap, I set up the robots.txt rule shown above, disallowing the entire space of both legit redirector links and trap pages. Just recently, however, one of the people in the club searched Google for their own name and was quite surprised when one of the results on the first page was a link to the redirector script, with a title consisting of their e-mail address followed by my name. Of course, they immediately e-mailed me and wanted to know how to get their address out of Google's index. I was quite surprised too, since I had no idea that Google would index such URLs at all, seemingly in violation of my robots.txt rule. I did manage to submit a removal request to Google, and it seems to have worked, but I'd like to know why and how Google is circumventing my robots.txt like that and how to make sure that none of the disallowed pages will show up in their search results. P.S. I actually found out a possible explanation and solution, which I'll post below, while preparing this question, but I thought I'd ask it anyway in case someone else might have the same problem. Please do feel free to post your own answers. I'd also be interested in knowing if other search engines do this too, and whether the same solutions work for them also.
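
    The gist of the standard fix, for anyone else with the same problem: robots.txt only blocks crawling, not indexing of URLs Google discovers through external links. To keep a URL out of the index entirely, you serve a noindex signal, and you must allow crawling so the bot can actually see it. A minimal sketch, assuming Apache with mod_headers and that /email is a single script file:

        <Files "email">
            Header set X-Robots-Tag "noindex"
        </Files>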

    Read the article

  • Advertisement programs that allow "clickjacking" (earning advertisement revenues by popups generated by clicks on the website)?

    - by Tom
    Whether clickjacking is an ethically responsible way of earning advertisement revenue is a subjective matter and should not be discussed here. However, it appears that quite a lot of popular sites generate "popups" when you click any of their links or buttons. An example is the Party Poker advertisement (I am sure many of you will have seen this one). I wonder, though: what kind of advertisement companies allow such techniques? Surely Google AdSense does not? But which do, and are they reliable partners?

    Read the article

  • mailing for categories

    - by nerkn
    There are more than 10 categories on my site, and users can register for more than one category. My PHP script prepares content for each category and merges that content per user according to their preferences, so every user gets exactly the categories they want. The problem is that I need to send 340+ mails per hour, and DreamHost doesn't allow that. What do you suggest? I'm thinking of a service like MailChimp, but I couldn't find that scenario; do they support per-category content, etc.? Can I use SMTP on DreamHost?
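
    For context, a minimal sketch of the kind of thing an external SMTP relay makes possible (Python purely as an example; the relay host, credentials, and data layout are hypothetical, and services such as Mailgun or Amazon SES expose relays like this):

        # Minimal sketch: one digest per user, built from their subscribed categories.
        import smtplib
        from email.mime.text import MIMEText

        def send_digests(users, content_by_category,
                         relay="smtp.relay.example", port=587,
                         sender="mailer@example.com", password="secret"):
            with smtplib.SMTP(relay, port) as smtp:
                smtp.starttls()
                smtp.login(sender, password)
                for address, categories in users.items():
                    body = "\n\n".join(content_by_category[c] for c in categories)
                    msg = MIMEText(body)
                    msg["Subject"] = "Your category digest"
                    msg["From"] = sender
                    msg["To"] = address
                    smtp.send_message(msg)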

    Read the article

  • Frequency to submit sitemap to search engines

    - by user577691
    I have gone live with my site and, being new to search engines and SEO, I'm not sure of the best way to handle sitemap.xml. I have created sitemap.xml and submitted it to Google (using Webmaster Tools), Yahoo/Bing (using Bing Webmaster), and Ask.com. Now, since the site will be updated 2-3 times per week, I am not sure what the best approach is. Do I need to submit sitemap.xml again? If I need to submit sitemap.xml again and again, how frequently should I submit it? Please suggest the best approach.
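
    Two low-effort alternatives to manual resubmission (example.com is a placeholder): advertise the sitemap in robots.txt, which crawlers re-read regularly, and/or ping Google's sitemap endpoint whenever the file changes:

        # in robots.txt
        Sitemap: http://www.example.com/sitemap.xml

        # HTTP GET to notify Google of an update
        http://www.google.com/webmasters/tools/ping?sitemap=http%3A%2F%2Fwww.example.com%2Fsitemap.xml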

    Read the article

  • I'm seeing a new format for title elements on major sites: a few tags, and only then the title of the site itself. Is this now optimal SEO?

    - by Tchalvak
    I'm seeing a lot of sites move to a format similar to the one now in use by Stack Exchange sites (i.e. short topic, maybe a long topic, and only finally the site's name). Is this optimal SEO? It seems kind of weird, both because I have only begun to notice changes in this direction recently, and because it seems like it would make it harder to tell from search results which site you're actually visiting, even if the topic is one that matches. Still, sites like Facebook, Stack Overflow, etc. probably aren't wrong, so I'm wondering if I should make my sites use that format going forward...

    Read the article

  • Have search engines recognise multi-lingual sites

    - by mark
    Hello. I have organised my site by language using the following structure: www.domain.com (Spanish-language version) and www.domain.com/en (English-language version). The site's subject matter is Spanish, hence the Spanish version takes priority at the domain root. Although the site is new, I would hope that, at this stage, visits from google.com and google.es would return the English and Spanish versions respectively. Are there any particular steps I need to take to separate the two? Should I add them both as individual sites in Google's Webmaster Tools, and should I submit individual sitemaps for each 'site', or one for the whole? Thanks in advance.
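
    The usual mechanism for pairing the two versions is rel="alternate" hreflang annotations in the head of each page, using the structure above:

        <link rel="alternate" hreflang="es" href="http://www.domain.com/" />
        <link rel="alternate" hreflang="en" href="http://www.domain.com/en/" />

    Webmaster Tools also accepts the root and /en as separate "sites", so you can geo-target each and submit a sitemap per version.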

    Read the article

  • Will my traffic come back after my site redesign?

    - by Steve
    I screwed up. I relaunched my site after rebuilding it without setting up the proper 301s, and traffic immediately dropped about 60% (it's not really something I had thought about). After about a week and a half, I set up the 301s yesterday and resubmitted my sitemap to Google. Google has yet to reindex the whole thing, but traffic isn't getting any better. Is it likely to come back? If so, how long will it take? Has this happened to you? Any info is appreciated. I am really anxious!
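
    For reference, the shape such 301s usually take in .htaccess (the paths are hypothetical; on Apache either directive works):

        # one-off mappings
        Redirect 301 /old-page.html /new-section/new-page/

        # or a pattern, when a whole section moved
        RewriteEngine On
        RewriteRule ^old-section/(.*)$ /new-section/$1 [R=301,L]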

    Read the article

  • Visual sitemap generator

    - by rugbert
    I'm looking for something to visually create a sitemap for one of my websites. I'd like something with a tree structure, so I get a hierarchical view of my site. I have a couple of requirements, though: the ability to map password-protected pages, and (not really a requirement) the ability to integrate Google Analytics data. I'm trying an evaluation version of PowerMapper, but the version that includes Analytics integration is around $300, so I'm looking for something cheaper.

    Read the article

  • RewriteRule for URL Subdirectory Root

    - by JYerdon
    Have not found this in my searches on SE. I need this scenario to work: • A user visits someurl.com/news/somefolder or someurl.com/news/somefolder/: they get redirected to someurl.com/somefolder. • If the user visits just someurl.com/news or /news/, they are allowed through to /news. Here is my current rule: RewriteRule ^news/(.*) /$1 [NC,R=301,L] How do I make it allow the second bullet point? The first seems to work with no issues. Thanks all! POST UPDATE I now have: RewriteCond %{REQUEST_URI} ^news RewriteRule ^/news news/ [NC,L] RewriteCond %{REQUEST_URI} ^/news/(.*)$ RewriteRule ^news/(.*) /$1 [NC,R=301,L] BUT it still doesn't let me go to the URL something.com/news/ Any thoughts?
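
    A minimal sketch that satisfies both bullet points with a single rule: (.+) requires at least one character after news/, so someurl.com/news and someurl.com/news/ no longer match and fall through to /news, while anything deeper is redirected:

        RewriteEngine On
        RewriteRule ^news/(.+)$ /$1 [NC,R=301,L]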

    Read the article

  • Help with Google Pages; using them in general

    - by Sugarcube
    Is there any way someone could help me with building a website using Google Pages? I cannot for the life of me figure out how to do much more than create a new page, and I have been up all night. I thought I knew computers and websites, but apparently I need a refresher course. Please, if you can help, I would be most appreciative. -Sugarcube Edit: It seems this was too vague. But honestly, I don't even know where to begin with the way Google has it set up. I tried to set one up with a template and a layout, but the layout has pages that I did not create and that I don't know how to edit to make suitable for my website. I guess if I had to start somewhere, it would be with how to make it do what I want it to do.

    Read the article

  • SEO: page similarity percentage

    - by user1360479
    Using the Similar Page Checker (http://www.webconfs.com/similar-page-checker.php) you can check how similar one page is to another. Is there any rule of thumb for how high a percentage Google accepts, i.e. at what point Google considers a page so similar to another that it will not index it? I'm having this issue with two pages on the same domain whose "how to order" information is similar, which is why the percentage is about 70. Thanks.
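
    As far as I know, Google publishes no exact threshold, but the standard defence when two of your own pages must share content is a canonical link from the secondary page to the preferred one (the URL is hypothetical):

        <link rel="canonical" href="http://www.example.com/how-to-order/" />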

    Read the article

  • How do private-sales ecommerce sites manage their SEO?

    - by 142857
    On a private-sales ecommerce site, users need to sign up or sign in before they can access the pages of the website. So even if a user tries to navigate directly to a product page, he is redirected to sign in. I am wondering, then, how these sites manage their SEO, as this would imply Google can't crawl those pages either. Or do they completely ignore the SEO benefit of allowing Google to crawl their product and catalogue pages?

    Read the article

  • What is the average page size for a single-page application (SPA)? [on hold]

    - by Emmanuel Istace
    I'm developing a single-page application with a lot of CSS and JavaScript. For now the page weighs 1.3 MB and is composed of 5 sections. Here are the rounded stats: document 10 kB; styles 60 kB; images 450 kB (already compressed, including a big gallery of thumbnails); JavaScript 700 kB (600 kB of "framework": jQuery, jQuery UI, Bootstrap, Modernizr, Waypoints, ...; plus 100 kB of custom JS); fonts 125 kB. And the site is not finished yet (it will include the Google Maps API, and some others...). My questions are: Do you have any statistics about the average weight of an SPA? Given that this is the whole website, do you think it's acceptable? Is lazy loading (for the images) a solution? What will the impact on SEO be? Is Google's "200 kB rule" still relevant? Do you know of good tools to detect which JavaScript code is not used during the execution of a page, and hence how much of those 700 kB of framework JS could be optimised away? Can a caching strategy be an answer?

    Read the article

  • How do I prevent ISPs from killing downloads of files in mid-transfer?

    - by Gorchestopher H
    I run a small website with a few users, low traffic, mostly to share personal mp3 files with a small community. Depending on their ISP, my users can't always download or stream the larger files; by larger I mean larger than 1 MB. Essentially either the host stops sending or the client stops receiving: one of the links along the connection chain simply ends the connection before the transfer completes. Traceroute shows no connection issues, and there are no problems with short transfers that take no more than a few seconds; it's the transfers of ten seconds or more that just end. Even a straight download from a direct link can yield this error if you have the wrong ISP. Strangely enough, this is most common with users whose ISPs are essentially independent providers that buy service via a fiber link. Unfortunately these providers aren't very knowledgeable, are unable to do any testing, and insist it's a problem with the host. I have gotten my host to move my site to different servers of theirs, to the same effect. Nearly identical sites (affiliate sites, actually) experience no such issue. What can I do to further troubleshoot this matter? How can I prove that someone is dropping the ball, and identify who that party is? Can I do a 5 MB traceroute? EDIT Maybe I can clear up some misconceptions: The files are not very large, simply over 2 MB. The users do not have "slow" connections; they are at least 5 Mbps. This "timeout" happens very quickly, in the realm of 5 seconds, so I don't know if it's a timeout or not; the user often gets 1 or 2 MB in that chunk of time. I have tried streaming with a Flash player, saving the target, forcing the download, and allowing the browser to stream the file. I have tried different browsers (Firefox, IE, Chrome). Users are able to download identical files when they are on different hosts.
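
    A cheap way to separate "drops after N seconds" from "drops after N bytes" is to fetch the same file at full speed, throttled, and as a partial range (the curl flags are standard; the URL is a placeholder):

        curl -o /dev/null -v http://example.com/files/sample.mp3                    # baseline
        curl -o /dev/null -v --limit-rate 50k http://example.com/files/sample.mp3   # slow: does it die at a time limit?
        curl -o /dev/null -v --range 0-999999 http://example.com/files/sample.mp3   # 1 MB slice: does it die at a byte limit?

    Running mtr from both ends during a failing transfer can also localise packet loss better than a one-shot traceroute.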

    Read the article

  • What is required to create local business rich-snippets complete with sitelinks AND breadcrumbs?

    - by Felix
    I have a local business directory site. I would like to mark up my business-listing 'profile' pages for display as enhanced listings/rich snippets, complete with business names, addresses, and phone numbers. I would also like to display sitelinks and path-based breadcrumbs to help users navigate the site's directory hierarchy (which is deep). Is there a limit to the number of breadcrumbs a site can expose? Is there a separate limit on the number of breadcrumbs Google/Bing will display in the SERP? What kind of markup language(s) would best position my site to show sitelinks AND breadcrumbs? For example: Find a business > Browse by location > State > City > Zip, or: Find a business > Choose service > Browse by location > State > City. Thanks all!
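
    For the breadcrumb part, a minimal schema.org BreadcrumbList sketch in microdata (the URLs are hypothetical); the business name, address, and phone map onto schema.org/LocalBusiness properties in the same fashion:

        <ol itemscope itemtype="http://schema.org/BreadcrumbList">
          <li itemprop="itemListElement" itemscope itemtype="http://schema.org/ListItem">
            <a itemprop="item" href="http://example.com/find-a-business/">
              <span itemprop="name">Find a business</span></a>
            <meta itemprop="position" content="1" />
          </li>
          <li itemprop="itemListElement" itemscope itemtype="http://schema.org/ListItem">
            <a itemprop="item" href="http://example.com/find-a-business/state/">
              <span itemprop="name">State</span></a>
            <meta itemprop="position" content="2" />
          </li>
        </ol>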

    Read the article

  • Visits, page views, bounce rate, new visitors, visit duration (Google Analytics): which one is the top priority for SEO?

    - by HOY
    This is the case: my site is getting a lot of traffic from an image (a company logo) because that image ranks 1st in Google's search results for the company's name. (I have no idea how that happened.) The image is a must for my website, but it is not relevant to the site's content, so people searching for the image find my site without meaning to, and I get interesting statistics: http://postimage.org/image/3oyvrjoz9/ Pros: total visits and avg. new visits. Cons: avg. pages/visit, avg. visit duration, bounce rate. In summary, I am confused about whether this image is helpful to my website, because I don't know how to weigh those 5 statistics against each other. P.S. My website is 2 months old, and we are working on SEO at the moment. Another P.S.: I kindly ask you not to offer assumptions; I have assumptions of my own already, I need real knowledge. Edit: the search keyword is "arcelik logo", on google.com.tr; search URL: https://www.google.com.tr/search?hl=en&q=arcelik+logo&bav=on.2,or.r_gc.r_pw.r_qf.&bvm=bv.41524429,d.Yms&biw=1366&bih=667&um=1&ie=UTF-8&tbm=isch&source=og&sa=N&tab=wi&ei=oZIDUfutAseVswa9zYHwCw

    Read the article
