Search Results

Search found 19186 results on 768 pages for 'sharepoint search'.

Page 159/768 | < Previous Page | 155 156 157 158 159 160 161 162 163 164 165 166  | Next Page >

  • How much of a benefit does a changing landing page give in terms of SEO?

    - by Glycan
    I have a friend with a small business and a website. He asked me whether he should add a section to his landing page, below the fold, featuring his most recent review (or something along those lines). Specifically, he wants to know if that's the most efficient use of his time. Is there a list of factors Google values, ranked against one another, so that these kinds of questions could be answered easily?

    Read the article

  • How to specify the importance of HTML elements?

    - by Julien Fouilhé
    Is it possible to specify which elements of the page are important or, more specifically, which elements of the page are not important? I'm using the new HTML5 elements (nav, header, footer, section, article, aside...), but my login form (which sits in the header of my page) sometimes shows up in the Google description of my website's pages... Is there a solution to this problem? Thank you.

    Read the article

  • Request Removal of naked domain from Google Index

    - by Pedr
    I have a site which was temporarily available at both example.com and www.example.com. All traffic to example.com is now redirected to www.example.com; however, during the brief period that the site was available at the naked domain, Google indexed it. So Google now has two versions of every page indexed: www.example.com, www.example.com/about_us, www.example.com/products/something, ... and example.com, example.com/about_us, example.com/products/something, ... For obvious reasons, this is a bad situation, so how can I best resolve it? Should I request removal of these pages from the index? There is still content at these URLs, but they now redirect to the www subdomain equivalent. The site has many hundreds of pages, but the only way I can see to request removal is via the Remove outdated content screen in Webmaster Tools, one URL at a time. How can I request removal of an entire domain (i.e. the naked domain) without it affecting the true site located at the www subdomain? Is this the correct strategy, given that all the naked-domain URLs now redirect to their www equivalents?
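
    For illustration, here is a minimal sketch of the kind of host-level 301 redirect the question describes, assuming a Node/Express front end (in practice this is just as often done in the web server or DNS/CDN configuration); the hostnames are the question's own placeholders:

        // Hedged sketch, not from the original post: permanently redirect any
        // request that arrives on the naked domain to the www host, keeping
        // the original path and query string.
        var express = require('express');
        var app = express();

        app.use(function (req, res, next) {
          if (req.hostname === 'example.com') {
            return res.redirect(301, req.protocol + '://www.example.com' + req.originalUrl);
          }
          next();
        });

        app.listen(80);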

    Read the article

  • SharePoint: authenticating users via forms authentication

    - by sbee
    My problem is the following (SharePoint newbie): I want to change the default zone from a Windows-authenticated zone to a forms-authenticated zone, thereby forcing the site collection administrator to log in via forms authentication rather than Windows. The SharePoint users will be accessing the site internally; my goal is to effectively replace Windows authentication with forms authentication, as my company does not have Active Directory installed. So far I have created an ASP.NET application that adds the users to the database; the database was created with the .NET Framework ASP.NET SQL registration tool (aspnet_regsql). However, when I change the default zone to the AspNetSqlMembershipProvider (forms) and attempt to add my site collection administrator via Central Administration, I get the error "No Exact Match found", as shown on the screenshot. My hunch is that the people picker is somehow failing to read the users from the database, but research into correcting that has so far proved fruitless. I have made all the relevant changes (membership provider, connection string, people picker) in the config files of the sites involved (Central Administration site, my test site, and the Add Users site). I left out the role provider for now, as it is optional. Help with this would be highly appreciated...

    Read the article

  • Is there a way to force Windows to recognize a network folder as a local drive, for the purposes of indexing?

    - by NoCatharsis
    I just started using the file search program Everything at work to search through documentation on our shared drives. This is after disappointments with Google Desktop and Windows Search. I love the speed of Everything, but I wish it were able to index other shared folders. My makeshift solution was to somehow force Windows to recognize the necessary shared folders as local drives, then add them to the index list. I have also considered using SyncToy, but this requires downloading all data to my drive, which could be terabytes of information - obviously not a good idea on a small company network. What would be the best solution here?

    Read the article

  • Is it good to use the same keyword for multiple pages in one domain?

    - by Phanen
    Hi, I want to know: after Google's recent updates, will it be a good option to use the same keyword for multiple pages? Say my keyword is "driver update" and I have a folder "HP" on the website. HP has lots of models, like the HP Elitebook, HP Envy, HP Mini, HP Pavilion and many more. The HP Elitebook has many versions, like the HP Elitebook 2530p, HP Elitebook 2730p, and HP Elitebook 8530w. Now should I create pages like "Driver update in HP Elitebook 2530p", "Driver update in HP Elitebook 2730p", "Driver update in HP Elitebook 8530w", or a single page "Driver update in HP Elitebook"? Which will be the better option for SE ranking: a single page, or multiple pages using the same keyword for different model versions?

    Read the article

  • Google indexed the same page under two URLs (despite rel-canonical)

    - by unor
    The Super User question "Playing mp3 in quodlibet displays “GStreamer output pipeline could not be initialized” error" is indexed under two URLs in Google: http://superuser.com/questions/651591/playing-mp3-in-quodlibet-displays-gstreamer-output-pipeline-could-not-be-initia http://superuser.com/questions/651591/playing-mp3-in-quodlibet-displays-gstreamer-output-pipeline-could-not-be-initia/652058 The first one is the canonical one; the corresponding rel-canonical is included in both pages: <link rel="canonical" href="http://superuser.com/questions/651591/playing-mp3-in-quodlibet-displays-gstreamer-output-pipeline-could-not-be-initia" /> Google also indexed http://superuser.com/a/652058, which redirects to the answer: http://superuser.com/questions/651591/playing-mp3-in-quodlibet-displays-gstreamer-output-pipeline-could-not-be-initia/652058#652058 Now, the second URL from above is the same as this one minus the fragment #652058. So Google seems to strip the fragment, which results in exactly the same page under another URL (= containing the answer ID /652058 as suffix), and indexes it, too -- despite rel-canonical and duplicate content. Shouldn’t Google recognize this and only index the canonical variant? And what could be the reason why Stack Exchange includes the answer ID in the URL path, and not only in the fragment (resulting in various URL variants for the same page)?

    Read the article

  • SEO and JavaScript now that Google admits to parsing JS

    - by schlingel
    We're planning on building an HTML snapshot creation service to provide the Google crawlers with static HTML of our JS-driven single-page application. Is this still necessary and/or encouraged now that Google openly admits it is parsing JS? How should I tackle this evaluation? Are there tools that provide data on when snapshots are needed and when Google's parsing is sufficient? Is serving snapshots still better because it would be much faster than the incremental JS rendering?
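
    For reference, a minimal sketch of the kind of snapshot service being discussed, assuming a Node/Express server; the renderSnapshot helper below is a hypothetical placeholder, not something from the original post:

        // Hedged sketch: serve prerendered HTML to crawlers that request the old
        // ?_escaped_fragment_= form of a URL, and let normal browser traffic fall
        // through to the JS single-page application.
        var express = require('express');
        var app = express();

        // Placeholder: a real service would run a headless browser here and
        // return the fully rendered markup for the given path.
        function renderSnapshot(path) {
          return '<html><body><h1>Snapshot of ' + path + '</h1></body></html>';
        }

        app.use(function (req, res, next) {
          if (req.query._escaped_fragment_ === undefined) return next(); // normal visitor
          res.set('Content-Type', 'text/html').send(renderSnapshot(req.path));
        });

        app.listen(3000);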

    Read the article

  • SharePoint Search Problem: The start address sps3://server cannot be crawled.

    - by Clara Oscura
    With this post, I'm going to start a series on problems I have encountered with SharePoint search. Error: The start address sps3://luapp105 cannot be crawled. Context: Application 'Search_Service_Application', Catalog 'Portal_Content' Details:  Access is denied. Verify that either the Default Content Access Account has access to this repository, or add a crawl rule to crawl this repository. If the repository being crawled is a SharePoint repository, verify that the account you are using has "Full Read" permissions on the SharePoint Web Application being crawled.   (0x80041205) (Event ID: 14, Task Category: Gatherer) Solution: give appropriate permissions to User Profile Synchronisation Service http://social.technet.microsoft.com/Forums/en-US/sharepoint2010setup/thread/64cdf879-f01e-4595-bc52-15975fefd18d http://www.dotnetmafia.com/blogs/dotnettipoftheday/archive/2010/03/29/how-to-set-up-people-search-in-sharepoint-2010.aspx

    Read the article

  • JavaScript Tips and Tricks

    - by ybbest
    1. Replace all commas in a JavaScript string.

        var totalAmount = "100,000,000,000";
        var find = ",";
        var replace = "";
        // With a string pattern, replace() only replaces the first ","
        totalAmount = totalAmount.replace(find, replace);
        alert(totalAmount);

        var totalAmount2 = "100,000,000,000";
        var newFind = /,/g;
        // A regular expression with the g flag replaces every ","
        totalAmount2 = totalAmount2.replace(newFind, replace);
        alert(totalAmount2);
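
    As an aside not taken from the original article: on newer JavaScript engines (ES2021 and later) the same result can be had without a regular expression, for example:

        // Hedged aside: replaceAll() needs a modern engine; split/join works everywhere.
        var totalAmount3 = "100,000,000,000";
        alert(totalAmount3.replaceAll(",", "")); // "100000000000"
        alert(totalAmount3.split(",").join("")); // same result on older engines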

    Read the article

  • Why can Indian link builders or SEO companies make so many high-quality links at the same time? [closed]

    - by chiba
    There are a lot of Indian SEO companies and link builders that offer a lot of high-quality links. Some of them, for example, offer links only from .co.uk domains or French sites with high PageRank. I have heard that even SEO companies from other countries outsource link building to India. Do they have special connections for building links? Or do they exchange information with other Indian companies and keep a big database of the sites where they can place links?

    Read the article

  • Google’s Zeitgeist 2012: A Year In Review

    - by Jason Fitzpatrick
    Once a year Google releases its Zeitgeist, an overview of what the world was searching for during the previous year. Check out the year-in-review video and then browse the entire Google Zeitgeist 2012 project.

    Read the article

  • SEO - Hidden content before main site content

    - by 0pt1m1z3
    I have two hidden divs before my main site content, one with the login form and another with the signup form. I then have login and signup buttons within the page that use jQuery to show or hide these divs. I like the effect this setup offers: the form drops down from the top of the page and pushes the rest of the content down. However, recently I have been getting serious about SEO and I am wondering if these divs have been affecting my SERP rankings. Basically, every non-logged-in page (everything bots see) has the same two display:none; divs at the top of the document flow. Is it bad? Should I re-engineer these forms and the way they are displayed?
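
    A minimal sketch of the kind of toggle being described, assuming jQuery is loaded; the element IDs are illustrative, not taken from the original post:

        // Two divs that start hidden (display:none) and slide down over the rest
        // of the content when the matching button is clicked.
        $('#login-button').on('click', function () {
          $('#signup-form').slideUp();     // close the other panel if it is open
          $('#login-form').slideToggle();  // drop the login form down from the top
        });
        $('#signup-button').on('click', function () {
          $('#login-form').slideUp();
          $('#signup-form').slideToggle();
        });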

    Read the article

  • How to figure out the recent PageRank of a website or any particular page (homepage)

    - by rajesh.magar
    This question comes up because Google's recently published algorithm changes have affected my website traffic, and I suspect my homepage PageRank has also dropped from 6 to 4 (I am not sure). I am not using any fancy SEO tools like SEOmoz, Majestic SEO, etc., so it's quite difficult for me to confirm whether the PageRank has really been affected or not. Can anyone please suggest a good resource, tactic, or trick to address this question? Thanks!

    Read the article

  • Do you know any independent keyword (phrase) statistics and trend website?

    - by Sam
    Hi all, does anyone know of an equally impressive service that shows the number of times a specific keyword (phrase) has been searched, along with a branch of other similar words? The one discussed in this video (Wordtracker.com) seems very good, but unfortunately it has gone commercial, which is not what I'm looking for. I would really prefer a free tool... http://www.youtube.com/watch?v=H2M1tXtAc18&feature=related Any suggestions for similar free online tools are very welcome. Thanks

    Read the article

  • Can we 301 redirect to a new page, but still publish the old content somewhere else?

    - by KBS
    We have a page on the site which ranks well for an SEO term (top 5) but contains old information. We have added a new page, but Google doesn't rank it as well. The information on these pages is time-sensitive. Old: example.com/2013-related-information.html New: example.com/2014-related-information.html The obvious solution is to delete the old page and 301 redirect it to the new page. Now, can we still keep the old page by giving it a new URL? example.com/2013-related-information.html is redirected to example.com/2014-related-information.html, and the old 2013 content is recreated at a new address such as example.com/new-2013-related-information.html. What we are trying to do is send the user to the fresh page while still not destroying the record copy, in case someone wants to go and dig up the old information.
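
    A minimal sketch of the arrangement, assuming a Node/Express server (the question itself is server-agnostic); the file paths are the question's own placeholders:

        // Hedged sketch: the old URL 301-redirects to the fresh page, while the
        // preserved 2013 copy is republished under a new address.
        var express = require('express');
        var path = require('path');
        var app = express();

        app.get('/2013-related-information.html', function (req, res) {
          res.redirect(301, '/2014-related-information.html');
        });

        app.get('/new-2013-related-information.html', function (req, res) {
          res.sendFile(path.join(__dirname, 'archive', '2013-related-information.html'));
        });

        app.listen(80);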

    Read the article

  • Adding tags for SEO on a clothing website [duplicate]

    - by samyb8
    I am building a site for a women's accessories brand. The site has a homepage, a store page (where all accessories are displayed), a page for each accessory's description, an about page, and a contact page. There is also a whole setup for the shopping cart and checkout (irrelevant to this question). My issue is SEO. Where can I put the keywords? The homepage has only the menu and some photos. The store page displays the items and their titles. Each specific item's page has a description of the item (pulled from the database), its category, and its price. However, I feel like this is not enough for Google ranking. Where could I add tags on this type of site?

    Read the article

  • SharePoint turns toward social, the cloud, and mobile: Microsoft unveils the new features of the 2013 version and its integration with Yammer

    SharePoint turns toward social, the cloud, and mobile. Microsoft unveils the new features of the 2013 version and its integration with Yammer. At the SharePoint Conference 2012 in Las Vegas, Microsoft unveiled the new features of SharePoint 2013. For this major update of Microsoft's suite of enterprise application and portal tools, Microsoft has made significant investments in social, the cloud, and mobile. Until now, Microsoft had revealed little about the Yammer-inspired social features of SharePoint. As a reminder, Yammer is a tool for setting up an internal social network for an e...

    Read the article

  • Why are my Google searches redirected?

    - by Please Help
    This machine was infected with various malware. I have scanned the system with Malwarebytes; it found and removed some 600 or so infected files. Now the machine seems to be running well, with only one exception: some Google search results are being redirected to shady search engines. If I copy the URL from the Google search results and paste it into the address bar, it goes to the correct site, but if I click the link I am redirected somewhere else. Here is my log file from HijackThis: http://pastebin.com/ZE3wiCrk

    Read the article

  • Fresh start outside Google's crapbox [on hold]

    - by Krzysztof Minister Bytu
    I may have experimented with my website too much: Google first cut the flow of visitors considerably, and now I haven't had a single one for four days. It's a joke that they've done this, because I've put a lot of work into it, but that's a topic for another day. My question is about avoiding it in the future. I want to carry the partly improved design from that website over to a new one and get a new domain name. The question is: in that case, do I have to change the hosting account as well (it has my old website name in its address), or is changing the domain enough for Google to treat it as something new from a "fresh user"? In other words, does Google look past the domain name at the actual hosting address? I'd hate to waste another few months of hard work, so I prefer to take every possible precaution, but not paying for another hosting plan would be easier on the wallet.

    Read the article

  • Which tags to use for good SEO on the page

    - by Aaditi Sharma
    I have an event page with the following items: the event name; the venue name(s) (in some cases up to 5 or more venues); the event info (genre(s), language, type(s)); the date(s) on which the event takes place; and the event description. Since the event name is unique and present in the title, I am assigning <H1> to it. However, there are multiple venue names, and the same venue may be repeated across the page, along with the dates. (Each) event info item is used a single time on the page. The dates are presented in a styled manner using multiple spans; however, I am going to use a title on them. The event description is in a <p> tag. So my question is: which heading tags should I use for a good semantic description and SEO? Also, for the title on the dates, which format should I keep the date in (dd/mm/yyyy)?

    Read the article

  • Will we be penalized for having multiple external links to the same site?

    - by merk
    There seem to be conflicting answers to this question. The most relevant ones seem to be at least a year or two old, so I thought it would be worth re-asking. My gut says it's OK, because there are plenty of sites out there that do this already. Every major retailer site usually has links to the manufacturer of whatever item it is selling; go to www.newegg.com and they have hundreds of links to the same site, since they sell multiple items from the same brand. Our site allows people to list a specific genre of items for sale (not porn; I'm just keeping it generic since I'm not trying to advertise), and on each item listing page we have a link back to their website if they want. Our SEO guy is saying this is really bad and Google is going to treat us as a link farm. My gut says that when we have to start limiting useful user features on our site to boost our ranking, or start jumping through hoops like trying to hide text using JavaScript, then something is wrong. Some clients are only selling one to a handful of items, while a couple of our bigger clients have hundreds of items listed, so they will have hundreds of pages that link back to their site. I should also mention that there will be a handful of pages for the bigger clients where it may appear they have duplicate pages, because they will be selling 2 or 3 of the same item, and the only difference in the content of the page might just be a stock #. The majority of the pages, though, will have unique content. So, will we be penalized in some way for having anywhere from a handful to a few hundred pages that all link to the same site? If we are penalized, what's the suggested way to handle this? We still want to give users the option to go to the client's site, and we would still like to give a link back to the client's site to help their own SE rankings.

    Read the article

  • Have the Start menu index applications in places other than Program Files?

    - by user74757
    I love the ability to type any phrase into the start menu, and Windows 7 will bring up relevant executables under Program Files. However, I have a separate folder of my own where I store several portable applications - usually small apps that run straight out of an unzipped folder with little dependency. I have all of these under a specific hard-coded location - "PortableApps", but I would really like to tell the start menu to search this folder as well as Program Files for executables when searching. The search is often clogged because the executables I'm looking for are classified under "Documents" in the search results and buried in other non-related files. Is there a way I can achieve this in Windows 7? Thanks for any suggestions - Chase

    Read the article

  • Change from static HTML file to meta tag for Google Webmaster verification

    - by Wilfred Springer
    I started verifying the server by putting a couple of static HTML files in place. Then I noticed that Google wants you to keep these files in place. I didn't want to keep the static HTML files, so I want to switch to an alternative verification mechanism and include the meta tag on the home page. Unfortunately, once your site is verified, you never seem to be able to change to an alternative way of verification. I tried removing the HTML pages. No luck whatsoever. Google still considers the site to be 'verified'. Does anybody know how to undo this? All I want to do is switch to the meta-tag-based method of site ownership verification.

    Read the article

  • How long is the penalty for duplicate ecommerce content after it has been resurrected?

    - by will
    I am fixing all of the duplicate content on my ecommerce site with all-original descriptions, etc. How long does it take Google to start ranking it again? I used to have a good ranking that converted quite a few sales; in the last week I have had next to nothing. Also, would the disclaimer I created under each product be considered duplicate content, since it appears on most of my product pages and is the same?

    Read the article
