Search Results

Search found 11086 results on 444 pages for 'asynchronous pages'.

Page 213/444

  • Are the contents in the front page considered as duplicate of the post?

    - by yibe
    I asked this same question on Stack Overflow, but it was closed as off topic, so I am posting it here. In WordPress blogs, the front page of the blog displays many posts in whole or as excerpts. When the link to a post is clicked, the content is opened with another template file (single.php). Would the content displayed on the front page and on the post pages be considered duplicate? Does it harm SEO in any way?

    Read the article

  • Collapsible menu and amount of links in a web page

    - by dstonek
    One of my pages contains three levels of a collapsible menu (JS + CSS from mycssmenu.com). There are a dozen first-level items displayed to users, each one with various second-level items, and finally a lot of third-level items, each one containing a related link. This generates a lot of internal links (300+). For SEO, should I change the way the collapsible menu is displayed to reduce the number of links? What do you suggest? I would like to avoid forcing users to open a new page just to see what the third-level items are and eventually follow one of their links.

    Read the article

  • What’s new in ASP.NET MVC 2 Wrox Blox available for purchase

    My latest book, the What's new in ASP.NET MVC 2 Wrox Blox, is now available for purchase from the Wrox store at the cost of US $6.99. For those who are not familiar with them, Wrox Blox are short and concise ebooks that cover very specific topics. Ranging from 30 to 70-80 pages, they are a very good option in case you need to solve a specific problem or learn a specific technology, but you don't want to buy a whole book when you would only read a chapter or two. And this ebook is exactly like that... Did you know that DotNetSlackers also publishes .NET articles written by well-known .NET authors? We already have over 80 articles in several categories, including Silverlight. Take a look here.

    Read the article

  • Make the Web Fast: Automagic site optimization with mod_pagespeed 1.0!

    mod_pagespeed is an open-source Apache module that automatically optimizes web pages and the resources on them: images, CSS, JavaScript, and much more. In this episode, we'll catch up with Joshua Marantz, the tech lead of the project at Google, and talk about the history of mod_pagespeed, its fast-growing adoption (130K+ sites!), its technical architecture, and how it works under the hood. Finally, we'll talk about the upcoming 1.0 release milestone for the project. If you're curious about mod_pagespeed, then this is definitely a show you won't want to miss! From: GoogleDevelopers. Time: 01:05:06.

    Read the article

  • Didn't you have problems with the upgrade from 11.10 to 12.04 (LibreOffice)?

    - by Pascal Paulus
    This is the first time I'm reporting something, hoping that it can be useful for you. Since updating from 11.10 to 12.04 (which includes updating LibreOffice, I suppose), I can no longer work with any document that was originally made in LibreOffice. Every change freezes the screen, and I can't save anything. I'm talking about complex documents with lots of internal references and footnotes and some proper text styles, about 230 pages long (PhD work). I wanted to alert you that probably something is wrong, but as I don't have any technical knowledge, I don't know what could be useful to help you in your great job of making good free software. My little desktop has 2 GB of RAM and an Atom processor (I can look for more details if that would be useful to you).

    Read the article

  • Apress Deal of the day - 5/Feb/2011

    - by TATWORTH
    Today's $10 Deal of the Day from Apress at http://www.apress.com/info/dailydeal is Pro ASP.NET 4 in C# 2010, Fourth Edition ($59.99, published June 2010, Matthew MacDonald). ASP.NET 4 is the latest version of Microsoft's revolutionary ASP.NET technology and the principal standard for creating dynamic web pages on the Windows platform. Pro ASP.NET 4 in C# 2010 raises the bar for high-quality, practical advice on learning and deploying Microsoft's dynamic web solution. I am reviewing this book at the moment, but I was already sufficiently impressed by it to have bought the PDF the day it became available last December.

    Read the article

  • MediaWiki plugin for dynamic content via forms

    - by Geek42
    Are there any plugins for MediaWiki that would allow me to create a page that has a form at the top which, when filled in, populates tags further down in the document? Say someone put a form with "Source Server:" and "Destination Server:" fields at the top. Once those were typed in, the names would automatically be populated into the content below, so that when following the instructions you could just read the docs and not have to mentally replace things, and possibly mess it all up. I'm not looking to make pages that are permanent, just ones that can have values entered before they are followed. Any suggestions?
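
    Not a MediaWiki extension, but a minimal client-side sketch of the substitution idea (the placeholder syntax, element id and data attributes are illustrative assumptions, not part of the question):

        // TypeScript sketch: copy form values into {{Placeholder}} tokens in the page body.
        const template = document.getElementById("instructions")!.innerHTML;

        function applySubstitutions(): void {
          let html = template;
          document.querySelectorAll<HTMLInputElement>("input[data-field]").forEach(input => {
            const field = input.dataset.field ?? "";   // e.g. "Source Server"
            if (input.value === "") { return; }        // leave the token visible until filled in
            const token = new RegExp("\\{\\{" + field + "\\}\\}", "g");
            html = html.replace(token, input.value);
          });
          document.getElementById("instructions")!.innerHTML = html;
        }

        document.addEventListener("input", applySubstitutions);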

    Read the article

  • Powermapper alternatives and Google Analytics?

    - by rugbert
    What's a good application to map the hierarchical structure of a website and maybe get some Google Analytics action going on? PowerMapper is pretty expensive, and the trial version didn't seem particularly good at importing Google Analytics CSV files as advertised. In fact, out of the 1000 pages mapped, exactly 0 were successfully imported. I don't really need all the features PowerMapper offers anyway, so the price tag is a bit much. All I really need is a simple visual representation of my (automatically generated) hierarchical site structure and the ability to integrate my Google Analytics stats (page views mostly) with it.

    Read the article

  • Can we 301 redirect to a new page, but still publish the old content somewhere else?

    - by KBS
    We have a page on the site which ranks well for an SEO term (top 5) but contains old information. We have added a new page, but Google doesn't rank it that well. Information on these pages is time sensitive. Old: example.com/2013-related-information.html. New: example.com/2014-related-information.html. The obvious solution is to delete the old page and do a 301 redirect to the new page. Now, can we still keep the old page by giving it a new URL? example.com/2013-related-information.html is redirected to example.com/2014-related-information.html, and the old content is recreated under a new address such as example.com/new-2013-related-information.html. What we are trying to do is send the user to the fresh page while still not destroying the record copy, in case someone wants to go and dig up the old information.
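
    A minimal sketch of that setup on an Apache host (an assumption; the equivalent directives differ on other servers), using the URLs from the question:

        # .htaccess sketch: permanently redirect the old URL to the fresh page...
        Redirect 301 /2013-related-information.html /2014-related-information.html

        # ...while the archived copy is simply published as a normal page at its new address,
        # e.g. /new-2013-related-information.html (optionally kept out of the index with noindex).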

    Read the article

  • JavaOne India Technical Sessions

    - by Tori Wieldt
    If you’re working with Java technology, it pays to go straight to the source for your information. At JavaOne and Oracle Develop India, you’ll be able to choose from more than 90 sessions, hands-on labs, keynotes, and demos delivered by today’s most knowledgeable Java experts. You'll also hear the most up-to-date information on current releases and future directions of Java standards and technologies, and see the latest Java developer tools and solutions. Register now! Technical sessions include:
    - Project Lambda: To Multicore and Beyond
    - Introduction to JavaFX 2.0
    - GlassFish REST Administration Back End: An Insider Look at a Real REST Application
    - Java-Powered Home Gateway: Basis of the Next-Generation Smart Home
    - Mobile Java Evolution
    - Cloud-Enabled Java Persistence
    Visit the JavaOne India web pages for a complete list of conference sessions. See you there!

    Read the article

  • How to track opens and pageviews in PDFs?

    - by Osvaldo
    I know how to track clicks on links to PDFs and PDF downloads. But I need to track how many times a PDF is opened after being downloaded and, if possible, how many times certain pages are shown to users. Tracking has to be done without warnings that personal information is being sent somewhere. I do not want readers' personal information, just to know how many opens happened, so such warnings would be inaccurate. Can anyone help by pointing to a tutorial or an example? If you are sure that this can't be done, can you please point to documentation that explains why?

    Read the article

  • Getting through a lengthy book?

    - by Mr_Spock
    This may seem like a weird question, but since we're challenged, as engineers, to constantly adapt to changing technologies, we always find ourselves buried in documentation. That said, we also need to consider that time is of the essence, because people want their stuff fixed and improved with little hesitation, if any. How do you get through lengthy books and manuals within a short period of time? Take, for example, "The Linux Programming Interface" by Michael Kerrisk, which is roughly 1500 pages in length. How would you get through a monster of a book like this if you're pressed for time while still learning most of the material?

    Read the article

  • TechEd 2014 Day 4

    - by John Paul Cook
    Many people visiting the SQL Server booth wanted to know how to improve performance. With so much attention being given to COLUMNSTORE and in-memory tables and stored procedures, it is easy to overlook how important tempdb is to performance. Speeding up tempdb I/O improves performance, and the best way to do this is to not do the I/O in the first place. With SQL Server 2014, tempdb page management is smarter: pages are more likely to be released before being unnecessarily flushed to disk. Read more about...

    Read the article

  • How should I track multi-valued page attributes (e.g. tags) using custom variables?

    - by Simon
    Our pages can each have many tags, e.g. 'football', 'sms', 'nsfw', etc., which we would like to track in Google Analytics. We're already tracking things like category using Google Analytics custom variables, and we've used three of the five available slots so far. How can we track tags the same way? If we just mush them all together, e.g. 'football, sms, nsfw', can we still track the pages that are tagged 'football'? What's the right way to track multi-valued page attributes using custom variables?
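
    One commonly suggested workaround (an assumption here, not an official recommendation) is to join the tags with a delimiter, store them in a single page-level slot, and rely on "contains"-style matching in reports and segments. A minimal sketch with the classic ga.js custom variable call (slot number 4 is illustrative; use any unused slot):

        // Declared so the sketch compiles in TypeScript outside the standard GA snippet.
        declare const _gaq: { push(command: unknown[]): void };

        // Join this page's tags and record them in one page-scoped custom variable (scope 3).
        const tags: string[] = ["football", "sms", "nsfw"];
        _gaq.push(["_setCustomVar", 4, "Tags", tags.sort().join("|"), 3]);
        _gaq.push(["_trackPageview"]);

    Pages tagged 'football' could then be found by filtering or segmenting on custom variable values that contain "football".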

    Read the article

  • Category title and effect on SEO and ranking [closed]

    - by Mark
    We are working on a jobs and skills website (similar to Skill Pages) and are deciding on the names of categories. Rather than having loads of categories and sub-categories like, for example, Builder, Electrician, Carpenter, etc., we would like to have more general category names that are easier on the eye. So, for example, we have House, Computer, Education, Art, etc., and a builder would be in the category House and a few others. Will this style negatively affect our SEO and ranking? And if so, should we abandon it and go back to traditional categories and sub-categories?

    Read the article

  • Attackers are increasingly using legitimate sites for their exploits, reveals a Kaspersky Lab report

    Kaspersky Lab has just published its latest observations on the evolution of IT security threats. They highlight a rise in online attacks in 2010, with more than 580 million incidents detected. A new trend has also emerged: the risks no longer loom only over sites offering illegal content, but also over legitimate pages (such as shopping or online gaming sites), which cyber-criminals increasingly target. In general, the attackers compromise vulnerable servers and inject malicious code...

    Read the article

  • Common light map practices

    - by M. Utku ALTINKAYA
    My scene consists of individual meshes. At the moment each mesh has its own associated light map texture, and I was able to implement the light mapping using these many small textures. 1) Of course, I want to create an atlas, but how do you split atlases into pages? Do you group the light maps of objects that are close to each other, and load light maps on the fly if the scene is expected to be big? 2) The 3D authoring software provides automatic UV coordinates for each mesh in the scene, but there are empty areas in the texel space, so if I scale the texture polygons, the texel density of each face will not match that of other meshes; if I create the atlas like that, there will be varying light map resolution. How do you solve this: just leave it as it is, or ignore resolution? Actually, these questions also apply to other non-tiled maps.
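
    On question 2, one common approach (an assumption here, not something stated in the question) is to rescale each chart's UVs so texel density is uniform before packing the atlas. A rough sketch of the per-chart scale factor:

        // TypeScript sketch: factor to multiply a mesh's lightmap UVs by so that its
        // texel density matches a chosen target. All names are illustrative.
        function uvScaleForUniformDensity(worldArea: number,   // summed triangle area in world units
                                          uvArea: number,      // summed triangle area in [0,1] UV space
                                          atlasSize: number,   // atlas resolution in texels (square)
                                          targetTexelsPerUnit: number): number {
          // Current texels per world unit for this chart.
          const currentDensity = Math.sqrt((uvArea * atlasSize * atlasSize) / worldArea);
          // Scale every UV of the chart by this factor, then repack the atlas.
          return targetTexelsPerUnit / currentDensity;
        }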

    Read the article

  • SEO: Is promoting your backlinks a good strategy for improving search results for my site's name?

    - by user4394
    I run a website that's been around for about three years in the sports space. I am successfully ranking well for targeted keywords, but searching for the name of my site itself returns very poor results: it shows my site, its FB/Twitter pages, and then 15 pages of unrelated spam that happen to contain two words that, when combined, form my website's name. After that, my backlinks begin to show up sporadically. As far as I can tell, I simply don't have enough backlinks, and the backlinks I do have are ranked worse than the spam. (Site Explorer lists 200 external links to any page on our domain and 20 external links directly to the front page.) To counter this, my strategy is to promote my backlinks so they get a better page rank than the spam. Does that make sense? Am I going in the right direction, or should I just focus on getting more backlinks pointing directly to my site? Thanks in advance, and I'd be happy to answer any questions I can (without giving away my site, of course).

    Read the article

  • Serverless Web Application

    - by Andrea Di Persio
    In my company we work on software that produces reports in HTML format. My bosses love the fact that static HTML pages can be moved across computers simply by moving/copying a folder, with no web server involved, so the customer only needs a browser. The problem is that they are asking me to implement a lot of features which are very hard to implement properly and in a clean way without an application server. Frame cross-domain problems, the impossibility of working with GET and POST data, no URL routing... it is very hard to work with these limitations. Has anyone had a similar experience and wants to share their tricks/suggestions? Do I need to tell my boss 'there is no future without a web server'? Regards.

    Read the article

  • How to enable a Web portal-based enterprise platform on different domains and hosts without customization [on hold]

    - by S.Jalali
    At Coscend, a cloud and communications software product company, we have built a Web portal-based collaboration platform that we would like to host on five different Windows- and Linux-based servers, in different hosting environments that run Web servers. Each of these Windows and Linux servers has a different host name and domain name (and IP address). Our team would appreciate your guidance on: (1) Is there a way to implement this Web portal-based platform on these Linux and Windows servers without customizing the host name, domain name and IP address for each individual instance? (2) Is there a way to create some variables using JavaScript for the host name and domain name and call them from the different implementations? If a reference to the host/domain names occurs on hundreds of our pages, the variables or objects would replace it. (3) This is part of making these JavaScript modules portable and re-usable for different environments and instances. The portal is written in JavaScript embedded in HTML5 and styled with CSS3. Other technologies include Flash, Flex, PostgreSQL and MySQL.
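
    For (2), a minimal sketch of deriving those values from the page's own location rather than hard-coding them (the object and function names are illustrative assumptions):

        // TypeScript sketch: read host/domain once from the browser and reuse it everywhere,
        // so the same pages run unchanged on each Windows or Linux instance.
        const siteConfig = {
          hostName: window.location.hostname,                              // differs per instance
          baseUrl: window.location.protocol + "//" + window.location.host, // scheme + host (+ port)
        };

        // Build links and service calls from the config instead of literal host names.
        function resourceUrl(path: string): string {
          return siteConfig.baseUrl + "/" + path.replace(/^\/+/, "");
        }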

    Read the article

  • Find out when a new domain appears in search results

    - by TerryB
    Does anyone know a way to do the following? I want to know whenever a new domain starts appearing in the Google search results for a particular query. For a given Google search query, I'd like to receive an alert whenever a new domain pops up and starts appearing in the search results for that query. Alternatively, it would be great if you could just sort Google search results by the age of the domain, making it easy to find new sites; as far as I can tell, you can only sort by when the page was "last updated". Is something like this possible? EDIT: Following John's suggestion of Google Alerts: the problem with Google Alerts is that it sends you any new PAGES appearing in the search results, not just new DOMAINS.
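
    There doesn't seem to be a built-in sort or alert for this, but a rough sketch of the do-it-yourself route (all names are illustrative, and how the result URLs are collected is left open): keep the set of domains already seen for the query and alert on the difference each time you re-check.

        // TypeScript sketch: report only domains not seen in earlier checks of this query.
        function newDomains(resultUrls: string[], seenDomains: Set<string>): string[] {
          const fresh: string[] = [];
          for (const url of resultUrls) {
            const domain = new URL(url).hostname.replace(/^www\./, "");
            if (!seenDomains.has(domain)) {
              fresh.push(domain);
              seenDomains.add(domain); // persist this set between runs
            }
          }
          return fresh;
        }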

    Read the article

  • JavaScript-Only Search Method [on hold]

    - by user2118228
    I need to put a search function on a website that is going to be distributed on a CD-ROM with no access to the internet. It has 80 pages and about 500 'items', so I'd prefer not to have to hard-code hundreds of 'if' statements if possible. I've found a few programs you can buy that will index and generate results (Zoom Search, JSS Index, The German Guys'), but there are odd quirks with each one. Plus, I would rather code it myself to get complete control over it, and to really understand what it's doing. Basically, searching for a few words would display the product image and description; clicking on that would take you to the related URL. This is kind of complicated, and I can't find an easy solution that doesn't involve hundreds of if statements. Has anyone ever created anything like this, or know of a better method? I'm not really sure of a better way to go about this. I've used PHP/MySQL for search results before, but this cannot run any PHP.
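
    A minimal sketch of the usual no-server approach (a prebuilt index shipped as a static file on the disc and filtered in the browser; the item fields and file layout are assumptions):

        // TypeScript sketch: one record per item, normally generated by a build script
        // that scans the 80 pages, then loaded by the search page on the CD-ROM.
        interface Item { title: string; description: string; url: string; image: string; }

        const index: Item[] = [
          { title: "Widget A", description: "A sample widget", url: "pages/widget-a.html", image: "img/widget-a.jpg" },
          // ...roughly 500 entries
        ];

        // Return items whose title or description contains every word of the query.
        function search(query: string): Item[] {
          const words = query.toLowerCase().split(/\s+/).filter(w => w.length > 0);
          return index.filter(item => {
            const text = (item.title + " " + item.description).toLowerCase();
            return words.every(w => text.includes(w));
          });
        }

    The results can then be rendered as image-plus-description links, which avoids hard-coding any per-item 'if' statements.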

    Read the article

  • Change from static HTML file to meta tag for Google Webmaster verification

    - by Wilfred Springer
    I started verifying the server by putting a couple of static HTML files in place. Then I noticed that Google wants you to keep these files in place. I didn't want to keep the static HTML files, so I want to switch to an alternative verification mechanism and include the meta tag on the home page. Unfortunately, once your site is verified, you never seem to be able to change to an alternative way of verification. I tried removing the HTML pages: no luck whatsoever; Google still considers the site to be 'verified'. Does anybody know how to undo this? All I want to do is switch to the meta-tag-based method of site ownership verification.
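
    For reference, the meta-tag method places a single tag in the head of the home page, along the lines of the sketch below (the content value here is a placeholder; Google issues the real token when you choose that verification method):

        <meta name="google-site-verification" content="YOUR-TOKEN-FROM-GOOGLE" />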

    Read the article

  • What technology(s) would be suitable for the front-end part of a Java web game?

    - by James.Elsey
    As asked in a previous question, I'm looking to create a small MMO that will be deployed onto GAE. I'm confused about what technologies I could use for the user interface. I've considered the following: JSP pages - I've got experience with JSP/JSTL and would find this easy to work with, but it would require the user to "submit" the page each time they perform an action, so it may become a little clumsy for players. Applet - I could create an applet that sits on the front end and communicates with the back-end game engine; however, I'm not sure how good this method would be, and I have not used applets since university. What other options do I have? I don't have any experience in Flash/Flex, so there would be a big learning curve there. Are there any other Java-based options I may be able to use? My game will be text based; I may use some images, but I'm not intending to have any animations/graphics etc. Thanks

    Read the article

  • Are there Any Concerns with Importing Document Files From a Competing Product?

    - by Thunderforge
    I have a new product that serves the same purpose as my competitor's long-standing product. One thing I have considered doing is allowing my program to import document files created by their product in order to provide an easy way for users to migrate towards mine. Naturally, this would be done without the competitor's permission, as it goes against their interests. I've seen this done before with office suite software (e.g. Open Office and Apple Pages can import MS Word documents), but I'm wondering if there are any concerns, legal or ethical, with me doing this. I fully expect any answers will most likely fall under the "I am not a lawyer" clause, but it would be helpful to have a starting point for anything I would need to be aware of, or if I shouldn't need to worry.

    Read the article
