Search Results

Search found 24498 results on 980 pages for 'lock pages in memory'.

Page 269 of 980

  • Run your own XHTML validator

    - by TATWORTH
    Whilst the W3C do provide an excellent service for manually checking your web pages, there are times when an alternative is required. There is, for example, a web service at http://validator.w3.org/docs/api.html that can be used for programmatically checking your pages (provided you make no more than 1 call per second). The W3C also provide all the source code needed to run your own validation service. Get the full details at: Installation and development information for the W3C Markup Validator, http://validator.w3.org/docs/devel.html, and Source Availability, http://validator.w3.org/source/

    Read the article

  • Google Page Rank - The Ultimate Popularity Contest

    Although you might initially think of it as simply the way the Google search engine ranks pages, the term Google Page Rank is actually a trademarked term that belongs to Stanford University. The term is a tribute to its creator, Larry Page, and refers to a complex mathematical algorithm that allows today's advanced search engines, like Google, to index and rank the millions and millions of pages that exist on the internet today.
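
    At its core, PageRank treats a link from page A to page B as a vote, weighted by A's own rank and split across A's outgoing links. A toy power-iteration sketch makes the idea concrete (the three-page link graph below and the 0.85 damping factor are illustrative textbook values, not anything specific to Google's implementation):

    def pagerank(links, damping=0.85, iterations=50):
        """Toy power-iteration PageRank over a dict mapping page -> pages it links to."""
        pages = set(links) | {p for targets in links.values() for p in targets}
        n = len(pages)
        rank = {page: 1.0 / n for page in pages}              # start uniform
        for _ in range(iterations):
            new_rank = {page: (1.0 - damping) / n for page in pages}
            for page, targets in links.items():
                if not targets:                               # dangling page: spread its rank evenly
                    for p in pages:
                        new_rank[p] += damping * rank[page] / n
                else:
                    share = damping * rank[page] / len(targets)
                    for target in targets:
                        new_rank[target] += share
            rank = new_rank
        return rank

    # Hypothetical three-page site, purely for illustration.
    graph = {"home": ["about", "blog"], "about": ["home"], "blog": ["home", "about"]}
    for page, score in sorted(pagerank(graph).items(), key=lambda kv: -kv[1]):
        print(f"{page}: {score:.3f}")

    Pages with many incoming links from already well-ranked pages end up with the highest scores, which is the intuition behind the "popularity contest" described above.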

    Read the article

  • META Tags in SEO - Should You Use Them?

    Onsite SEO is going by the wayside and is not used by the search engines much anymore, but it may still benefit you to optimize each of your web pages. The search engines have a formula to determine keyword relevancy on each page of your web site. The technical term for this formula is an algorithm, and each engine has its own unique algorithm that it uses to rank pages.

    Read the article

  • SEO Services

    With all of the social networking websites popping up all over the internet, many are afraid that all of these new pages will make it increasingly harder for one to get his or her website noticed. This may be the case considering that new people are creating social network web pages at the rate of about one per minute.

    Read the article

  • What's the best way to cache a growing database table for html generation?

    - by McLeopold
    I've got a database table which will grow in size by about 5000 rows an hour. For a key that I would be querying by, the result set will grow by about 1 row every hour. I would like a web page to show the latest rows for a key, 50 at a time (this is configurable). I would like to try and implement memcache to keep database activity low for reads. If I run a query and create a cache result for each page of 50 results, that would work until a new entry is added. At that time, the page of latest results gets a new result and the oldest result drops off. This cascades down the list of cached pages, causing me to update every cache result. It seems like a poor design. I could build the cache pages backwards; then for each page requested I would get the latest 2 pages and truncate to the proper length of 50. I'm not sure if this is good or bad. Ideally, the mechanism I use to insert a new row would also know how to invalidate the proper cache results. Has someone already solved this problem in a widely accepted way? What's the best method of doing this?

    EDIT: If my understanding of the MySQL query cache is correct, it has table-level granularity for invalidation. Given the fact that I have about 5000 updates before a query on a key would need to be invalidated, it seems that the database query cache would not be used. MS SQL caches execution plans and frequently accessed data pages, so it may do better in this scenario. My query is not against a single table with TOP N; one version has joins to several tables and another has sub-selects. Also, since I want to cache the generated HTML table, I'm wondering if a cache at the web server level would be appropriate. Is there really no benefit to any type of caching? Is the best advice really to just let a website query go through all the layers and hit the database on every request?
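
    One common pattern here is to cache the newest window of rows for each key as a single cache entry, slice the requested page out of it, and invalidate only that one entry when a row is inserted. The sketch below is minimal and illustrative: the dict stands in for a real memcached client (get/set/delete), and fetch_latest_rows() is a hypothetical database query, not anything from the question itself.

    PAGE_SIZE = 50
    WINDOW = 500        # how many of the newest rows to keep cached per key

    cache = {}          # stand-in for memcached; swap in a real client in production

    def fetch_latest_rows(key, limit):
        """Hypothetical DB call: newest `limit` rows for `key`, newest first."""
        raise NotImplementedError

    def get_page(key, page):
        rows = cache.get(f"latest:{key}")
        if rows is None:                          # cache miss: one query refills the whole window
            rows = fetch_latest_rows(key, WINDOW)
            cache[f"latest:{key}"] = rows
        start = page * PAGE_SIZE
        return rows[start:start + PAGE_SIZE]

    def on_insert(key, new_row):
        """Call this from the code path that inserts a row for `key`."""
        # Simplest: drop the window so the next read rebuilds it with one query...
        cache.pop(f"latest:{key}", None)
        # ...or, to avoid the miss entirely, prepend and trim in place instead:
        # rows = cache.get(f"latest:{key}")
        # if rows is not None:
        #     cache[f"latest:{key}"] = [new_row] + rows[:WINDOW - 1]

    Pages older than the cached window can fall through to the database, since they change rarely; this keeps invalidation down to one key per insert instead of cascading across every cached page, and it works the same whether you cache the raw rows or the generated HTML fragment.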

    Read the article

  • systemstate dump

    - by JaneZhang
    When the database hangs, Oracle Support will usually ask you to collect a systemstate dump so that the state of every process, the resources it holds and the resources it is waiting for can be analysed. A typical case is when the alert log reports "WAITED TOO LONG FOR A ROW CACHE ENQUEUE LOCK". Note that taking the dump takes some time and the trace files can be large (possibly several MB per process).

    1. Connect as sysdba:
    $sqlplus / as sysdba
    or
    $sqlplus -prelim / as sysdba   <== use this when the instance is hung and a normal connection is not possible
    SQL>oradebug setmypid
    SQL>oradebug unlimit;
    SQL>oradebug dump systemstate 266;
    (wait 1-2 minutes)
    SQL>oradebug dump systemstate 266;
    (wait 1-2 minutes)
    SQL>oradebug dump systemstate 266;
    SQL>oradebug tracefile_name;   ==> shows the name of the generated trace file

    2. Together with the systemstate dump, also collect a hang analyze so the wait chains between processes can be seen:
    $sqlplus / as sysdba
    or
    $sqlplus -prelim / as sysdba   <== use this when the instance is hung and a normal connection is not possible
    SQL>oradebug setmypid
    SQL>oradebug unlimit;
    SQL>oradebug dump hanganalyze 3
    (wait 1-2 minutes)
    SQL>oradebug dump hanganalyze 3
    (wait 1-2 minutes)
    SQL>oradebug dump hanganalyze 3
    SQL>oradebug tracefile_name;   ==> shows the name of the generated trace file

    For RAC, the systemstate dump can be taken for all instances from a single node:
    $sqlplus / as sysdba
    or
    $sqlplus -prelim / as sysdba   <== use this when the instance is hung and a normal connection is not possible
    SQL>oradebug setmypid
    SQL>oradebug unlimit
    SQL>oradebug -g all dump systemstate 266   <== -g all dumps every instance
    (wait 1-2 minutes)
    SQL>oradebug -g all dump systemstate 266
    (wait 1-2 minutes)
    SQL>oradebug -g all dump systemstate 266

    Hang analyze for RAC:
    SQL>oradebug setmypid
    SQL>oradebug unlimit
    SQL>oradebug -g all hanganalyze 3
    (wait 1-2 minutes)
    SQL>oradebug -g all hanganalyze 3
    (wait 1-2 minutes)
    SQL>oradebug -g all hanganalyze 3

    The resulting systemstate dump trace files are written to background_dump_dest (or the diag trace directory). Commonly used systemstate dump levels:
    10:  dump
    11:  dump + global cache of RAC
    256: short stack of each process
    258: dump (without lock element data) + short stack
    266: 256+10 --> short stack + dump
    267: 256+11 --> short stack + dump + global cache of RAC
    Levels 11 and 267 dump the global cache, which makes the trace files very large and is normally not needed. In most cases level 266 is sufficient, because it includes the process dumps plus the short stack of each process. Collecting the short stacks takes time, however; with a very large number of processes (for example around 2000) it can take about 30 minutes, so in that situation consider level 10 or level 258 instead. Compared with level 10, level 258 adds the short stacks but contains less lock element data.

    Trace file sizes produced at the different levels (instance with 37 processes):
    -rw-r----- 1 oracle oinstall    72721 Aug 31 21:50 rac10g2_ora_31092.trc  ==> 256 (short stack, roughly 2K per process)
    -rw-r----- 1 oracle oinstall  2724863 Aug 31 21:52 rac10g2_ora_31654.trc  ==> 10  (dump, roughly 72K per process)
    -rw-r----- 1 oracle oinstall  2731935 Aug 31 21:53 rac10g2_ora_32214.trc  ==> 266 (dump + short stack, roughly 72K per process)
    RAC:
    -rw-r----- 1 oracle oinstall 55873057 Aug 31 21:49 rac10g2_ora_30658.trc  ==> 11  (dump + global cache, roughly 1.4M per process)
    -rw-r----- 1 oracle oinstall 55879249 Aug 31 21:48 rac10g2_ora_28615.trc  ==> 267 (dump + global cache + short stack, roughly 1.4M per process)
    As the sizes show, there is normally no need to dump the global cache (level 11 or 267); in most cases a level 266 systemstate dump is enough.

    Read the article

  • Are we queueing and serializing properly?

    - by insta
    We process messages through a variety of services (one message will touch probably 9 services before it's done, each doing a specific IO-related function). Right now we have a combination of the worst-case (XML data contract serialization) and best-case (in-memory MSMQ) for performance. The nature of the message means that our serialized data ends up about 12-15 kilobytes, and we process about 4 million messages per week. Persistent messages in MSMQ were too slow for us, and as the data grows we are feeling the pressure from MSMQ's memory-mapped files. The server is at 16GB of memory usage and growing, just for queueing. Performance also suffers when the memory usage is high, as the machine starts swapping. We're already doing the MSMQ self-cleanup behavior. I feel like there's a part we're doing wrong here. I tried using RavenDB to persist the messages and just queueing an identifier, but the performance there was very slow (1000 messages per minute, at best). I'm not sure if that's a result of using the development version or what, but we definitely need a higher throughput[1]. The concept worked very well in theory but performance was not up to the task. The usage pattern has one service acting as a router, which does all reads. The other services will attach information based on their 3rd party hook, and forward back to the router. Most objects are touched 9-12 times, although about 10% are forced to loop around in this system for a while until the 3rd parties respond appropriately. The services right now account for this and have appropriate sleeping behaviors, as we utilize the priority field of the message for this reason. So, my question is: what is an ideal stack for message passing between discrete-but-LAN'ed machines in a C#/Windows environment? I would normally start with BinaryFormatter instead of XML serialization, but that's a rabbit hole if a better way is to offload serialization to a document store. Hence, my question. [1]: The nature of our business means the sooner we process messages, the more money we make. We've empirically proven that processing a message later in the week means we are less likely to make that money. While a performance of "1000 per minute" sounds plenty fast, we really need that number upwards of 10k/minute. Just because I'm giving numbers in messages per week doesn't mean we have a whole week to process those messages.
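
    As a sanity check on the figures above (the numbers are the ones stated in the question; this is average-rate arithmetic only and ignores burstiness):

    messages_per_week = 4_000_000
    minutes_per_week = 7 * 24 * 60
    avg_per_minute = messages_per_week / minutes_per_week    # roughly 400 msg/min on average
    target_per_minute = 10_000                               # stated requirement
    payload_kb = 15                                          # upper end of the 12-15 KB range

    print(f"average rate: {avg_per_minute:,.0f} msg/min")
    print(f"target rate:  {target_per_minute:,} msg/min (~{target_per_minute / avg_per_minute:.0f}x the average)")
    print(f"bandwidth at target: ~{target_per_minute * payload_kb / 1024 / 60:.1f} MB/s of serialized data")

    At roughly 2.5 MB/s of serialized payload the raw bandwidth is modest, which suggests the pressure comes more from per-message serialization and queueing overhead than from the wire itself.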

    Read the article

  • System crashed while upgrading, now unable to recover. Please help

    - by longloop
    Yesterday, while upgrading from 11.04 to 11.10, my system crashed due to a power cut. Ubuntu is unable to boot. While booting I get two options in grub: one is for recovery mode and the other is for normal Ubuntu booting (there are also others for memtest and booting to Windows). When I try to boot it normally, it pauses booting while displaying 'Checking battery status', though I am not using a laptop; I am on a desktop. In recovery mode the menu has 4 options: resume boot, fsck, remount and root (to go to a shell prompt). If I go to the shell and type 'apt-get dist-upgrade', it shows: W: not using locking for read only lock file /var/lib/dpkg/lock E: unable to write to /var/cache/apt E: The package lists or status files could not be parsed or opened. Please tell me how to recover from this situation.

    Read the article

  • Search Engine Optimization With PHP

    PHP pages have a reputation of being treated somewhat differently by SEO than static HTML pages, and certain questions come to the minds of many webmasters. If I use PHP for developing my website, will it be SEO compatible? And if I use the POST method in PHP, will that be a problem? I mean, won't the search engine spiders be trapped?

    Read the article

  • Session locked, Kubuntu 12.10

    - by user101815
    After leaving my laptop for a while, my Kubuntu session locked itself. (I'm not sure which of the various timeout criteria caused it.) So I got a screen that said my session was locked and that I needed to provide my password to unlock it. That's a nuisance rather than a big problem, but I'd like to fix it so a timed-out session doesn't lock the screen. I assume there's something in System Settings to disable the lockout -- but where? I looked in Power Management and found a setting "Lock screen on resume", but it's unchecked. Is this possibly a bug?

    Read the article

  • No Webcam Device

    - by Aliyah
    deeva@androliyah-A6200:~$ sudo lshw -C video
    [sudo] password for deeva:
      *-display
           description: VGA compatible controller
           product: Core Processor Integrated Graphics Controller
           vendor: Intel Corporation
           physical id: 2
           bus info: pci@0000:00:02.0
           version: 02
           width: 64 bits
           clock: 33MHz
           capabilities: msi pm vga_controller bus_master cap_list rom
           configuration: driver=i915 latency=0
           resources: irq:43 memory:f0000000-f03fffff memory:e0000000-efffffff ioport:e080(size=8)
    deeva@androliyah-A6200:~$
    How do I get my webcam to work?

    Read the article

  • Costs to Design and Manage a Website

    The problem with most "link building" services is that they spam your content across the internet. Also, most if not all of the links/pages are not indexed. If the pages are not indexed or re-indexed, then the search engine (i.e. Google) does not know that the link is there.

    Read the article

  • Backlinks for nonexisting page

    - by Michal
    I've bought a domain which was previously used by somebody back in 2007. Now I've realized that the internet is full of backlinks that point to non-existing parts (pages) under my domain (for example to mypage.com/whatever, where whatever is not present on my website, so a 404 error shows). I want to ask: are these links counted by Google (for PageRank) and other search engines, or not? And do I have to redirect these links to existing pages in order for them to be counted?
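
    The usual way to recover whatever value those old backlinks carry is to 301-redirect the dead URLs to the closest existing pages, since a permanent redirect tells search engines where the content now lives. How you set that up depends entirely on your stack; purely as an illustration, if the site happened to run on something like Flask (an assumption, not anything stated in the question), a 404 handler could map known legacy paths to current pages:

    from flask import Flask, redirect, request

    app = Flask(__name__)

    # Hypothetical mapping from old 2007-era paths to their closest current pages.
    LEGACY_REDIRECTS = {
        "/whatever": "/",               # the example path from the question
        "/old-blog/post-1": "/blog",
    }

    @app.errorhandler(404)
    def redirect_legacy_links(error):
        target = LEGACY_REDIRECTS.get(request.path)
        if target:
            # 301 tells crawlers the move is permanent, so link value is passed on.
            return redirect(target, code=301)
        return "Not Found", 404

    On Apache or Nginx the same mapping would normally be done with RedirectPermanent or return 301 rules in the server configuration rather than in application code.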

    Read the article

  • Legitimate SEO Services

    SEO is short for search engine optimization, which is a key to success in business. A website has little value if it is not properly promoted. Whenever an internet user searches for a specific product, service or piece of information, he takes the simplest route of searching through a search engine, and it is the habit of many people to look only at the five or six top websites for their purpose. No one has time to look through 100 pages of search engine results, and there is no need to when what they want is found on the top pages.

    Read the article

  • What Constitutes Offsite Web Optimization?

    Off-page optimization is about getting links to your pages. There are many ways to get them, but the most valuable links come from websites whose webmasters naturally link to your pages without any intervention on your part.

    Read the article

  • The Power of a Sitemap

    You have a website and would like search engine bots to index it, because you would like to improve your standing in search engine results. The only thing that would seem reasonable to do is to create back links, either in the pages or in a script, to enhance the pages' indexing.

    Read the article

  • Multiple Denial of Service (DoS) vulnerabilities in libxml2

    - by chandan
    CVE-2011-3905  Improper Restriction of Operations within the Bounds of a Memory Buffer vulnerability  CVSSv2 Base Score 5.0
    CVE-2011-3919  Improper Restriction of Operations within the Bounds of a Memory Buffer vulnerability  CVSSv2 Base Score 7.5
    Component: libxml2
    Product and Resolution: Solaris 11 - Contact Support; Solaris 10 - SPARC: 125731-07, X86: 125732-07; Solaris 9 - Contact Support
    This notification describes vulnerabilities fixed in third-party components that are included in Sun's product distribution. Information about vulnerabilities affecting Oracle Sun products can be found on the Oracle Critical Patch Updates and Security Alerts page.

    Read the article

  • Exadata X3 launch webcast - Available on-demand

    - by Javier Puerta
    Available on-demand, this webcast covers everything partners need to know about Oracle’s next-generation database machine. You will learn how to improve performance by storing multiple databases in memory, lower power and cooling costs by 30%, and easily deploy a cloud based database service. Exadata X3 combines massive memory and low-cost disks to deliver the highest performance at the lowest cost. View here!

    Read the article

  • Save and Run Programs From USB

    - by UbuntuRob
    At the moment I am running Ubuntu 12.10 from a USB memory stick, and I wondered whether it would be possible to save and run any programs I have downloaded from another USB stick instead of the one with the operating system on it. I'd like to be able to set the download location in Ubuntu Software Center to the second memory stick, but I don't know how to do this. If I could keep the operating system on one USB and the programs on the other, it would make everything much easier. Any ideas?

    Read the article

  • Judgment Calls in SEO Add Up to Results

    The titles and descriptions seen above the URLs on search engine results pages are taken by the search engines from the metadata of the pages at first, until other options are planted in directories during an SEO campaign. If a site has no meta description and no SEO content out on the Web, the search engine selects some relevant snippets of content from somewhere on the site. The answer to the query may be there; if not, the searcher will have to visit the site and look for the information.

    Read the article

  • What is Deep Linking and How is it Useful For My Website?

    People talk about deep linking, and while I honestly don't see it discussed a lot, it's definitely very powerful. When you understand how and why it works and how you can create deep backlinks, you'll be able to improve the traffic you get from Google by ranking for more keywords. Deep linking is more or less building backlinks to the inner pages of your website; for example, instead of just building backlinks to your index page/domain, you should really build backlinks to your inner pages as well.

    Read the article

  • What is Landing Page Optimisation?

    A landing page is the first page that you land on when you enter a website. Landing page optimisation is therefore the process of improving those pages to better match visitor searches and the expectations that come with them.

    Read the article

  • Multiple vulnerabilities in International Components for Unicode (ICU)

    - by chandan
    CVE-2011-2791  Improper Restriction of Operations within the Bounds of a Memory Buffer vulnerability  CVSSv2 Base Score 7.5
    CVE-2011-4599  Improper Restriction of Operations within the Bounds of a Memory Buffer vulnerability  CVSSv2 Base Score 7.5
    Component: International Components for Unicode (ICU)
    Product and Resolution: Solaris 10 - SPARC: 119810-07, X86: 119811-07; Solaris 11 11/11 - SRU 11.4
    This notification describes vulnerabilities fixed in third-party components that are included in Oracle's product distributions. Information about vulnerabilities affecting Oracle products can be found on the Oracle Critical Patch Updates and Security Alerts page.

    Read the article

  • Getting Your Website Noticed on Google

    The challenge of getting to the front of the search engine results involves constant effort from a website owner to improve page content. Yahoo and Bing rely on HTML tag structures, but Google optimization is more complex: Google separates affiliate pages and ad sites from sites that offer unique, relevant content. As Google has more rigid rules and requirements, site owners have to optimize pages to improve their Google index rank.

    Read the article
