Search Results

Search found 38010 results on 1521 pages for 'page curl'.

Page 808/1521 | < Previous Page | 804 805 806 807 808 809 810 811 812 813 814 815  | Next Page >

  • Multiple vulnerabilities fixed in Java 7U9

    - by RitwikGhoshal
    CVE and CVSSv2 Base Score (Component: Java 7; Product and Resolution: Solaris 11.1 10/12 SRU 2.5): CVE-2012-5086 10.0, CVE-2012-5083 10.0, CVE-2012-5087 10.0, CVE-2012-1533 10.0, CVE-2012-1532 10.0, CVE-2012-1531 10.0, CVE-2012-5076 10.0, CVE-2012-3143 10.0, CVE-2012-5088 10.0, CVE-2012-5089 7.6, CVE-2012-5084 7.6, CVE-2012-3159 7.5, CVE-2012-5068 7.5, CVE-2012-4416 6.4, CVE-2012-5074 6.4, CVE-2012-5071 6.4, CVE-2012-5069 5.8, CVE-2012-5067 5.0, CVE-2012-5070 5.0, CVE-2012-5075 5.0, CVE-2012-5073 5.0, CVE-2012-5079 5.0, CVE-2012-5072 5.0, CVE-2012-5081 5.0, CVE-2012-3216 2.6, CVE-2012-5077 2.6, CVE-2012-5085 0.0. This notification describes vulnerabilities fixed in third-party components that are included in Oracle's product distributions. Information about each CVE can be found in the Java SE Critical Patch Update - October 2012. Information about vulnerabilities affecting Oracle products can be found on the Oracle Critical Patch Updates and Security Alerts page.

    Read the article

  • How can I prevent people from seeing a listing of files in the parent directory if I haven't uploaded index.html? [closed]

    - by LedZeppelin
    Possible Duplicate: How to restrict the download of all files in a folder? I haven't uploaded index.html or index.php to my root directory. How can I prevent people from seeing a listing of the files in the parent directory? http://oi56.tinypic.com/sc739e.jpg Also, is it possible for people to obtain a list of all the files in the root directory once I upload index.html? I'm currently using .htaccess and htusers to prompt for a username and password when someone tries to access any file in the root directory. This may sound like a weird request, but would it be possible to have visitors come to the site (without an index.html) and simply not see the files? All the page would show is the following: Index of / Apache Server at mysite.com Port 80
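    The question itself doesn't name a fix, but since the listing footer shows Apache and .htaccess is already in use, a common approach (sketched here under that assumption) is to switch off automatic directory indexes so the bare directory returns 403 Forbidden instead of a file list:

        # .htaccess in the web root: disable automatic directory listings
        # (requires the host to allow "Options" overrides)
        Options -Indexes

        # Optional: show a friendlier page than the stock 403 error
        # (/forbidden.html is a hypothetical file name)
        ErrorDocument 403 /forbidden.html

    Uploading an index.html later simply takes precedence over the (now disabled) listing.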

    Read the article

  • Multiple style sheets best practice

    - by user1145927
    I am currently working on a project that has one large style sheet for about 20 pages. The style sheet contains some styles that are specific to certain pages. I'd like to break it up so there is one style sheet per page, plus one master style sheet that handles everything generic. The reason I want to do this is so people can work on multiple pages without having to worry about who has the large style sheet checked out (I'm using TFS). Is this good practice?
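    As an illustration of the split being described (file names are hypothetical), each page would load the shared sheet first and then its own page-specific sheet:

        <!-- every page: generic rules shared across the site -->
        <link rel="stylesheet" href="/css/master.css">
        <!-- this page only: overrides and page-specific styles -->
        <link rel="stylesheet" href="/css/checkout.css">

    With that layout, editing checkout.css never requires checking out master.css.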

    Read the article

  • Is there an open source solution that I can host on a web server that will allow users to anonymously upload a file to me?

    - by mjn12
    I'm looking for some kind of web application I can host on my Linux web server that will allow users to upload files of arbitrary size to me from their browser without requiring them to log in. Ideally this application would allow me to generate a link to my website that allowed for a one-time use upload. It might contain a unique, random key that was only good for that session. I could email them the link, they click it and are taken to a page where they can upload their file to me. I'm mainly targeting friends and family that need to send me files that are too large for email. I don't want to require them to install anything (dropbox), sign up and log in, etc. I'm definitely not teaching them to use FTP. This wouldn't be a difficult project for me to roll on my own but I'd like to take something off the shelf if it is possible. Does anything like this exist that my google-foo isn't turning up?
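    Nothing in the question names a particular package, but the flow described (a link containing a random one-time key, leading to a plain upload form) is small; the following is only a hypothetical Python/Flask sketch of that flow, not a pointer to an existing product:

        # Hypothetical sketch of a one-time-link upload endpoint (Python/Flask).
        import os
        import secrets
        from flask import Flask, request, abort
        from werkzeug.utils import secure_filename

        app = Flask(__name__)
        UPLOAD_DIR = "/var/uploads"      # assumption: a writable directory on the server
        tokens = set()                   # valid one-time keys; a real app would persist these

        def make_link():
            """Generate a single-use upload URL to email to a friend."""
            token = secrets.token_urlsafe(16)
            tokens.add(token)
            return "https://example.com/upload/" + token

        @app.route("/upload/<token>", methods=["GET", "POST"])
        def upload(token):
            if token not in tokens:
                abort(404)               # unknown or already-used key
            if request.method == "POST":
                f = request.files["file"]
                f.save(os.path.join(UPLOAD_DIR, secure_filename(f.filename)))
                tokens.discard(token)    # burn the key after one successful upload
                return "Thanks, got it."
            return ('<form method="post" enctype="multipart/form-data">'
                    '<input type="file" name="file"> <input type="submit" value="Send">'
                    '</form>')

    Arbitrary-size uploads would also need the web server's request-size limits raised; that part is outside the sketch.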

    Read the article

  • Website with over 1 million posts but not much textual content

    - by Far Se
    I've made a website that crawls files from all over the Internet, and I feel like Google will ban me if I send it sitemaps containing all of these pages (1m+), because the pages contain only the file name, size, number of downloads and the download link(s). I'm worried because I made another website like this in the past and Google banned it after one week with the reason "spam", even though it was not (maybe somebody falsely reported me?). Does someone have an idea about how to keep Google from banning my website? I've seen several other sites like mine and they don't get banned or... anything. Also, should I send the sitemap, or wait until Google indexes every page as it finds them? Thanks in advance :)
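    On the sitemap side, the protocol caps each sitemap file at 50,000 URLs, so a million-page site submits a sitemap index that points at many smaller files; a minimal sketch (file names are placeholders):

        <?xml version="1.0" encoding="UTF-8"?>
        <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <sitemap><loc>http://example.com/sitemap-files-00001.xml.gz</loc></sitemap>
          <sitemap><loc>http://example.com/sitemap-files-00002.xml.gz</loc></sitemap>
          <!-- ...one entry per 50,000-URL chunk... -->
        </sitemapindex>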

    Read the article

  • Where would you start if you were trying to solve this PDF classification problem?

    - by burtonic
    We are crawling and downloading lots of companies' PDFs and trying to pick out the ones that are Annual Reports. Such reports can be downloaded from most companies' investor-relations pages. The PDFs are scanned and the database is populated with, among other things, the: Title Contents (full text) Page count Word count Orientation First line Using this data we are checking for the obvious phrases such as: Annual report Financial statement Quarterly report Interim report Then recording the frequency of these phrases and others. So far we have around 350,000 PDFs to scan and a training set of 4,000 documents that have been manually classified as either a report or not. We are experimenting with a number of different approaches including Bayesian classifiers and weighting the different factors available. We are building the classifier in Ruby. My question is: if you were thinking about this problem, where would you start?
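    The question says the classifier is being written in Ruby; purely to illustrate the "weight the extracted phrase counts" starting point, here is a hypothetical sketch in Python (field names such as "contents" are assumptions) of a simple smoothed log-odds score trained on the 4,000 hand-labelled documents:

        # Hypothetical sketch: score a PDF as annual report / not, from phrase presence.
        import math
        from collections import Counter

        PHRASES = ["annual report", "financial statement", "quarterly report", "interim report"]

        def features(doc):
            text = doc["contents"].lower()
            return {p: p in text for p in PHRASES}

        def train(labelled_docs):
            """labelled_docs: iterable of (doc, is_report) pairs from the training set."""
            counts = {True: Counter(), False: Counter()}
            totals = Counter()
            for doc, label in labelled_docs:
                totals[label] += 1
                for phrase, present in features(doc).items():
                    if present:
                        counts[label][phrase] += 1
            return counts, totals

        def score(doc, counts, totals):
            """Log-odds that doc is an annual report; > 0 means 'probably a report'."""
            log_odds = math.log(totals[True] + 1) - math.log(totals[False] + 1)
            for phrase, present in features(doc).items():
                if not present:
                    continue
                p_true = (counts[True][phrase] + 1) / (totals[True] + 2)    # Laplace smoothing
                p_false = (counts[False][phrase] + 1) / (totals[False] + 2)
                log_odds += math.log(p_true) - math.log(p_false)
            return log_odds

    Page count, word count and first-line features could be folded into the same score once this phrase-only baseline has been measured against the labelled set.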

    Read the article

  • Advantages of country TLD vs. .com

    - by Tschareck
    I want to get a domain for my site. The site will be about Vienna, but the content will be in English. I am wondering whether I should get a .com domain or a .at domain. .at is both much cheaper and easier to get (there is less chance that my desired phrase is already registered). Is there any disadvantage in terms of SEO and page rank if my domain does not end with .com? The site will be in English and targeted not just at Austria but globally, mostly at foreign tourists. I don't care if the address is easy to remember; I expect most traffic to come from search engines anyway.

    Read the article

  • Managing Confidence

    - by andyleonard
    Introduction This post is the fifty-third part of a ramble-rant about the software business. The current posts in this series can be found on the series landing page. This post is about inspiring others. Hot Chicks - Baby chickens beneath a warming lamp… </NonSubtleSEOPloy> For those who do not know, we raise chickens that lay eggs – referred to as "laying hens". Natural attrition has taken our flock of laying hens to 11, plus one rooster. We recently received an order of new chicks (pictured...(read more)

    Read the article

  • Is it possible to use binary nvidia driver with GeForce 7300 SE?

    - by jrennie
    I have an Nvidia GeForce 7300 SE. It worked fine with the nvidia driver when I was using 10.10. When I upgraded to 12.04, the (nvidia-current) binary driver failed; I couldn't even get a login screen. The "nouveau" driver works okay, but the display is quite sluggish. I've read that my GeForce is blacklisted (here and here). But when I tried the suggested workaround of using nvidia-173, I discovered that it wouldn't install because of a failed dependency: xorg-video-abi-10 (package not available). The "precise" nvidia-173 package page notes this (a dependency bug?). So, my real question: is there a GeForce 7300 SE workaround for 12.04?

    Read the article

  • Are HTTP requests cached? [closed]

    - by nischayn22
    Many HTTP requests are sent repeatedly by browsers on almost every page load, such as the request for the jQuery .js file. Since these files are already used on so many sites, don't modern browsers keep a cache for them? I am thinking of a system where the browser has a cached copy of a .js file that is used very frequently. On a new request for the .js file, it asks the server for a hash of the .js file (provided the server can reply to that) and compares the returned hash with the cached copy's hash... the rest is intuitive.
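    Browsers do cache these files, and the hash-comparison idea described is roughly what HTTP already provides via ETags and conditional requests. A small Python sketch of that round trip (the URL is just an example):

        # Sketch of "ask for a hash instead of re-downloading" using standard HTTP ETags.
        import requests

        url = "https://example.com/static/jquery.min.js"   # placeholder URL

        first = requests.get(url)
        etag = first.headers.get("ETag")                   # server's fingerprint of this copy

        # Later request: send the fingerprint back instead of fetching blindly.
        again = requests.get(url, headers={"If-None-Match": etag} if etag else {})
        if again.status_code == 304:
            print("Not modified - reuse the cached copy")
        else:
            print("Changed - use the fresh body,", len(again.content), "bytes")

    In practice, long Cache-Control lifetimes and shared CDN URLs avoid even that validation round trip for libraries like jQuery.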

    Read the article

  • robots.txt: disallow URLs containing a string with a '/' at the end

    - by thanili
    I have a website with thousands of dynamic pages. I want to use the robots.txt file to disallow certain URL patterns corresponding to pages with duplicate content. For example, I have a page for article itemA belonging to category catA/subcatA, with URL: /catA/subcatA/itemA - this is the URL that I want indexed by Google. The article is also visible via tagging in various other places on the site. The URLs produced via tagging look like: /tagA1/itemA - this URL I do NOT want indexed by Google. However, I do want all tag listings indexed: /tagA1. So how can I achieve this? Disallow URLs that include a specific string followed by a '/'? /tagA1/itemA - disallow; /tagA1 - allow.
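    A trailing slash in the Disallow pattern does exactly this, because robots.txt rules are prefix matches; using the example paths from the question:

        User-agent: *
        Disallow: /tagA1/
        # /tagA1/itemA        -> blocked (matches the /tagA1/ prefix)
        # /tagA1              -> still crawlable (no trailing slash, so the rule doesn't match)
        # /catA/subcatA/itemA -> unaffected

    With many tags this means one Disallow line per tag; Googlebot also accepts * and $ wildcards in patterns, but those are an extension beyond the original robots.txt standard.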

    Read the article

  • A bug in Chrome 20's search bar looks like malware; how to work around it

    A bug in Chrome 20's search bar looks like malware; how to work around it. Since version 20, Chrome has had a rather annoying bug: the search bar automatically lands on a blank page. The first reflex of many users was to assume this behaviour was caused by new malware. In fact, it is nothing of the sort. If you are among those who were worried, rest assured. The temporary fix is very simple, although tedious: just remove "blank.html" from the generated URL. This fix was found after 116 messages on the Google Group dedicated to the problem.

    Read the article

  • How to create website shortcuts on the desktop / in a folder using Chrome?

    - by it's me
    Help with something really basic, which I am unable to figure out. In Windows creating a shortcut (link) for a website is as easy as dragging-and-dropping the favicon/address bar to the desktop or a folder. I tried the same in Ubuntu (Chrome browser), but it's not working. The web page is being saved as a file, but not as a link/shortcut. Am I missing something or is there no way to quickly create shortcuts to web pages/web sites without installing some app for that? If the above is true, is there an app that does what I need? I hope I am clear enough.
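    If drag-and-drop keeps saving the page instead of linking to it, one manual route is a small .desktop launcher file (the name and URL below are placeholders):

        [Desktop Entry]
        Type=Application
        Name=Example Site
        Exec=xdg-open https://example.com/
        Icon=web-browser
        Terminal=false

    Saved as something like ~/Desktop/example-site.desktop and marked executable, it opens the site in the default browser when double-clicked.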

    Read the article

  • SEO problem for site with 2 domains [closed]

    - by Harry
    Possible Duplicate: What is duplicate content and how can I avoid being penalized for it on my site? I have two domains pointing to the same site. I want both domains to co-exist; they share most of the same content, but they differ in design and they are aimed at different markets / rivaling communities. Is there a way to let Google know that these two domains are the same site, so I don't get hit with a duplicate content penalty? Any other general SEO tips for this situation would also be welcome. Thanks. Come on man, why was this closed? The linked page is completely irrelevant for me.
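    One widely used signal for this situation is a cross-domain canonical tag: each page on the secondary domain points at its counterpart on the domain you want ranked (domain and path below are placeholders):

        <!-- in the <head> of the page on the secondary domain -->
        <link rel="canonical" href="http://primary-domain.example/same-page.html">

    Note that this tells search engines to consolidate ranking onto one domain; both sites stay usable by visitors, but only the canonical one accumulates the search value.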

    Read the article

  • JVM Language Summit 2012 - Registration Open

    - by arungupta
    The 2012 edition of the JVM Language Summit is Jul 30 - Aug 1 at Oracle's Santa Clara campus. This is "an open technical collaboration among language designers, compiler writers, tool builders, runtime engineers, and VM architects". There are presentations, workshops, and lightning talks. About 70 language and VM implementers attended last year, and the talks were recorded; some videos from last year are available here. Check out the Main Page, the Agenda, Logistics, and the Wiki. See the online registration; for questions, send mail to inquire AT jvmlangsummit.com.

    Read the article

  • Help with a CMS for content only, not display

    - by user2091756
    Hello, I'm trying to make a tool for a school website. Students take a test and, according to the results (27 possibilities), they get a set of activities (questions) matching their level, which they can work through over about 3 months by logging in to the website periodically; teachers also need to log in and look at reports. I'm a graphic designer, so my skills are mostly HTML5 and CSS3, and I know some PHP (editing existing code only) and JavaScript (jQuery) as well. Most people tell me I need a CMS to build the tool, but all I find are CMSs for display, like blogs or news websites, which I don't think are useful for me because the website is already made in HTML and CSS3 only (I need to add an extra page for the tool). I understand I need to create users and give them rights according to their user type, and that I need a database where I can store all my questions. What is the best way to do this? What do you suggest? Thanks

    Read the article

  • Does Google see the output of document.write?

    - by merk
    I've got a site where people can list machinery for sale. Each item for sale has its own dynamic page. On each of these pages we allow the person selling the item to have a link back to their own website. Some people only sell a handful of items and some people are selling dozens or hundreds of items, so in some cases we can have 100 links back to their external site. Our SEO guy is saying this is bad (I'll open another question on that). So I was wondering: if I take the links and spit them out using document.write, will that hide them from Google and the other search engines?
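    For concreteness, the pattern being asked about looks like this (the seller values are placeholders); crawlers traditionally ignored script-generated markup, but Google has since become much better at executing JavaScript, so this is not a reliable way to hide links, and rel="nofollow" on the anchors is the usual remedy for "too many outbound links":

        <script>
          // Emit the seller's outbound link from script instead of static HTML.
          var sellerUrl  = "http://example-seller.com";   // placeholder
          var sellerName = "Example Seller";              // placeholder
          document.write('<a href="' + sellerUrl + '" rel="nofollow">' + sellerName + '</a>');
        </script>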

    Read the article

  • How to recover organic position in Google results after server down?

    - by ElHaix
    I have several sites that were doing quite well in terms of organic SEO rankings. I have the important sites set up in Google's Webmaster Tools. Long story short, the system was down for about two weeks. Now, in AdSense and Analytics, I am seeing that the page views are SLOWLY increasing, and I would like to know if there is anything I can do now to try to expedite the process of regaining those positions. Since there were several errors from that server, is it possible that Google will now rank any site from that IP address lower due to those two weeks of errors? Is this something that I just have to let ride out? Thanks.

    Read the article

  • redirecting subdomain to root index.php

    - by niku
    I am new to this. Here is the situation; I'm wondering if someone can suggest the best solution. I have the domain "www.mydomain.com" where I have a Magento website running. We are in the development stage, so I forwarded "www.mydomain.com" to "www.mydomain.net", and we have an under-construction page on "www.mydomain.net", because we do not want to show the development site. I also have the subdomain "beta.mydomain.com", which I pointed to "www.mydomain.com/index.php"; this works fine. But how can I show this without the URL in the browser changing from "beta.mydomain.com" to "www.mydomain.com/index.php"? We want to show our development to management.
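    Keeping "beta.mydomain.com" in the address bar means the content has to be served (or proxied) under that hostname rather than redirected. Assuming an Apache front end with mod_proxy enabled (an assumption, not something stated in the question), a reverse-proxy virtual host would look roughly like:

        <VirtualHost *:80>
            ServerName beta.mydomain.com
            # Fetch pages from the real site but keep beta.mydomain.com in the browser
            ProxyPreserveHost Off
            ProxyPass        / http://www.mydomain.com/
            ProxyPassReverse / http://www.mydomain.com/
        </VirtualHost>

    One caveat: Magento builds absolute links from its configured base URL, so links inside the proxied pages may still point at www.mydomain.com unless that is adjusted in Magento's configuration.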

    Read the article

  • Global Adopt a JSR Program Update

    - by heathervc
    The Global Adopt a JSR program, combining the efforts of SouJava and the London Java Community, is an excellent place to get Java User Group (JUG) resources for JSRs. It also has the potential to act as an extra set of eyes, ears and volunteers for JSRs. The project home is at http://adoptajsr.java.net. The wiki page explaining the whole program and the benefits to Spec Leads and EGs can be found there, including: the mailing list, [email protected] (Portuguese speakers - mainly Brazilian JUG members - have their own mailing list, and more language lists may be added as required), and the IRC channel, adoptajsr on irc.freenode.net. Also check out this InfoQ article with Martijn Verburg about the London Java user group, the Adopt a JSR program, the JCP and Oracle's handling of the Java community.

    Read the article

  • How can I alias domains to subdomains?

    - by user745668
    I have a main site with a bunch of subdomains created. Each subdomain is a blog, and I want each blog to have its own domain name, i.e. thisguy.com -> blog1.mainsite.com, thatguy.com -> blog2.mainsite.com. I bought the new domains and set up CNAME records as above to alias them to the appropriate subdomains. However, I get my host's "a domain is pointing to one of our servers but we don't know anything about it" landing page. How can I set up these domains as aliases of my subdomains?
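    The host's landing page means DNS is resolving but no virtual host answers for the new names; besides the CNAME records, each blog's web-server configuration has to list the extra domain. An Apache-style sketch (assuming Apache; other servers and hosting control panels have equivalents, often called "parked" or "addon" domains):

        <VirtualHost *:80>
            ServerName  blog1.mainsite.com
            ServerAlias thisguy.com www.thisguy.com
            DocumentRoot /var/www/blog1      # placeholder path
        </VirtualHost>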

    Read the article

  • Joomla URL issue with sh404SEF

    - by user5858
    I've been using sh404SEF with my site for a couple of months, but I'm getting URLs in the form: http://www.downloadformsindia.com/Income-Tax-Forms/all-income-tax-return-itr-forms-2010-2011.html?task=view If I remove this suffix (?task=view), it takes us to the same page. I raised this issue in the sh404SEF forum and was told that this data is treated as a parameter by search engines and hence ignored. I want to redirect, using RewriteMatch in .htaccess, all such URLs to the URLs without ?task=view: ....downloadformsindia.com/Income-Tax-Forms/all-income-tax-return-itr-forms-2010-2011.html?task=view should be redirected to http://www.downloadformsindia.com/Income-Tax-Forms/all-income-tax-return-itr-forms-2010-2011.html So my question is: will this redirection create 404s in Google Webmaster Tools? I have thousands of pages on the site.
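    Neither RedirectMatch nor a RewriteRule pattern can see the query string, so the usual way to strip ?task=view is a RewriteCond in .htaccess; a sketch, assuming mod_rewrite is enabled:

        RewriteEngine On
        # Only act when the query string is exactly task=view
        RewriteCond %{QUERY_STRING} ^task=view$
        # 301 to the same path with the query string removed (the trailing "?")
        RewriteRule ^(.*)$ /$1? [R=301,L]

    Because this is a 301 to URLs that already resolve, it should not produce 404s in Webmaster Tools; the ?task=view variants just get consolidated into the clean URLs over time.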

    Read the article

  • How do I improve my problem-solving ability?

    - by gcc
    How can we improve our problem-solving ability? Everyone says the same thing: "a real programmer knows how to handle real problems." But they forget to say how they acquired this ability, or where (in my opinion, school alone doesn't give it to us). If you have any ideas beyond the usual ones, feel free to give advice: solve more problems, do more exercises, write code, search Google, then write more... For me, my question is like "use a complex/well-known library instead of writing your own." In other words, I want your experience, book recommendations, and web pages.

    Read the article

  • Is there a way I can verify my Google Analytics custom report?

    - by SnowboardBruin
    I want to track scrolling on my website since it's one long page (rather than multiple pages). I saw several different methods, with and without an underscore before trackEvent, and with and without spaces after the commas:
        <script>
        ... ... ...
        ga('create', 'UA-45440410-1', 'example.com');
        ga('send', 'pageview');
        _gaq.push(['_trackEvent', 'Consumption', 'Article Load', '[URL]', 100, true]);
        _gaq.push(['_trackEvent', 'Consumption', 'Article Load', '[URL]', 75, false]);
        _gaq.push(['_trackEvent', 'Consumption', 'Article Load', '[URL]', 50, false]);
        _gaq.push(['_trackEvent', 'Consumption', 'Article Load', '[URL]', 25, false]);
        </script>
    It takes a day for counts to show up in Google Analytics, otherwise I would just tweak and test right now.

    Read the article
