Search Results

Search found 7807 results on 313 pages for 'dreamweaver sites'.


  • .NET 4.5 now supported with Windows Azure Web Sites

    - by ScottGu
    This week we finished rolling out .NET 4.5 to all of our Windows Azure Web Site clusters. This means that you can now publish and run ASP.NET 4.5 based apps, and use .NET 4.5 libraries and features (for example: async and the new spatial data-type support in EF), with Windows Azure Web Sites. This enables a ton of really great capabilities - check out Scott Hanselman's great post of videos that highlight a few of them. Visual Studio 2012 includes built-in publishing support to Windows Azure, which makes it really easy to publish and deploy .NET 4.5 based sites within Visual Studio (you can deploy both apps + databases). With the Migrations feature of EF Code First you can also do incremental database schema updates as part of publishing (which enables a really slick automated deployment workflow). Each Windows Azure account is eligible to host 10 free web sites using our free tier. If you don't already have a Windows Azure account, you can sign up for a free trial and start using them today. In the next few days we'll also be releasing support for .NET 4.5 and Windows Server 2012 with Windows Azure Cloud Services (Web and Worker Roles) - together with some great new Azure SDK enhancements. Keep an eye out on my blog for details about these soon. Hope this helps, Scott. P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu

    Read the article

  • .NET 4.5 now supported on Windows Azure Web Sites (Portuguese translation)

    - by Leniel Macaferi
    This week we finished rolling out .NET 4.5 to all of our clusters that host Windows Azure Web Sites. This means that you can now publish and run applications based on ASP.NET 4.5, and use .NET 4.5 libraries and features (for example: async and the new spatial data type support in Entity Framework), with Windows Azure Web Sites. This enables a ton of great capabilities - check out Scott Hanselman's post with videos (in English) that highlight some of these features. Visual Studio 2012 includes built-in support for publishing an application to Windows Azure, which makes it very easy to publish and deploy sites based on .NET 4.5 from within Visual Studio (you can publish applications + databases). With the Migrations feature of the Entity Framework Code First approach, you can also make incremental database schema updates as part of the publishing process (which enables an extremely slick automated publishing workflow). Each Windows Azure account is eligible to host up to 10 web sites for free, using our "Shared" scaling tier. If you don't already have a Windows Azure account, you can sign up for a free trial and start using these features today. In the next few days, we will also release support for .NET 4.5 and Windows Server 2012 for Windows Azure Cloud Services (Web and Worker Roles) - together with some great new improvements to the Windows Azure SDK. Keep an eye on my blog for more information about these releases soon. Hope this helps, - Scott. PS: Besides the blog, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/ScottGu. Text translated from the original post by Leniel Macaferi.

    Read the article

  • Taking web sites offline for demonstration

    While working in software development in general, and in web development for a couple of customers, it is quite common to need a test bed where the client can get an image, or better said, a feeling for the visions and ideas you are talking about. Usually here at IOS Indian Ocean Software Ltd. we set up a demo web site on one of our staging servers and provide credentials to the customer to access and review our progress and work ad hoc. This gives us the highest flexibility on both sides, as the test bed is simply online and available 24/7. We can update the structure, the UI and the data at any time, and the client is able to view it whenever it suits her/him best.

    Limited or lack of online connectivity

    But what is going to happen when your client is not able to be online - no matter for what reasons? Here are some of the more obvious ones:

    - No internet connection (permanently or temporarily)
    - Expensive connection, e.g. mobile data package, stay at a hotel, etc.
    - Presentation devices at an exhibition, e.g. using tablets or iPads
    - Being abroad for a certain time, and only occasionally online
    - No network coverage, especially on mobile
    - Bad infrastructure, like in Third World countries
    - Providing a catalogue on CD or USB pen drive

    Whatever the reason, we should be able to provide a solution that fits the circumstances of our customers.

    Presentation during an exhibition

    Recently, we had the following request from a customer: "Is it possible to let us have a desktop version of ResortWork.co.uk that we can use for demo purposes at the forthcoming Ski Shows? It would allow us to let stand visitors browse the sites on an iPad to view jobs and training directory course listings." Yes, sure, we can do that. You might wonder why they don't simply use 3G-enabled iPads for that purpose. As stated above, there can be several reasons - low coverage, expensive data packages, etc. Anyway, the point is not to question the request but to deliver a solution for it.

    Possible solutions... or not?

    We have built offline websites before, and have even established complete mirrors of one or two web sites on our systems. There are actually several possibilities to handle this kind of request, and the choice mainly depends on the system or device the offline site should be available on. Here, it is clearly stated that we have to address this on an Apple iPad - well actually, I think they'd like to use multiple devices during their exhibitions. The following is an overview of possible solutions depending on the technology or device in use, and how each can be done:

    - Replication of source files and database: The above-mentioned web site runs on ASP.NET, IIS and SQL Server. If a laptop or slate runs a Windows OS, the easiest way would be to take a snapshot of the source files and database, and transfer them as a local installation to those Windows machines. This approach would be fully operational on the local machine.
    - Saving pages for offline usage: This is actually a quite tedious job, but still practicable for small web sites.
    - Tool-based approach to 'harvest' the web site: There are quite some tools in the wild that could handle this job, namely wget, HTTrack, Web Copier, etc.
    - Screenshots bundled as a PDF document: Not really... ;-)
    - Creating a screencast or video: Simply navigate through your website and record your desktop session.

    Actually, we are using this kind of approach ourselves to track down difficult problems, in order to see and understand exactly what the user was doing to cause an error. Of course, this list isn't complete, and I'd love to get more of your ideas in the comments section below the article.

    Preparations for offline browsing

    The original website is dynamic and data-driven by ASP.NET. As we have to put the result onto iPads, we are going to choose the tool-based approach to 'download' the whole web site for offline usage. Again, depending on the complexity of your web site, you might have to check which of the applications produces the best results for you. My usual choice is wget, but in this case we ran into problems related to the rewriting of hyperlinks. As a consequence, we opted for HTTrack. HTTrack comes in different flavours: as a console application, but also as a GUI (WinHTTrack on Windows) or a web client (WebHTTrack on Linux/Unix/BSD). Here's a brief description taken from the original website:

    "HTTrack is a free (GPL, libre/free software) and easy-to-use offline browser utility. It allows you to download a World Wide Web site from the Internet to a local directory, building recursively all directories, getting HTML, images, and other files from the server to your computer. HTTrack arranges the original site's relative link-structure. Simply open a page of the 'mirrored' website in your browser, and you can browse the site from link to link, as if you were viewing it online."

    There is extensive documentation for all options and switches online; a general recommendation is to go through the HTTrack Users Guide by Fred Cohen, which covers all the initial steps you need to get up and running. Be aware that it will take quite some time to get all the necessary resources down to your machine. Actually, for our customer we ran the tool directly on their web server to avoid unnecessary traffic and bandwidth. After a couple of runs and some additional fine-tuning - explicit inclusion or exclusion of various externally linked web sites - we finally had a more or less complete offline version available.
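
    For reference, a minimal HTTrack console invocation for a mirroring job like this might look as follows - a sketch only, where the domain, output directory and filter are placeholders (the GUI flavours expose the same options):

      # Mirror the site into a local directory for offline browsing.
      #   -O  sets the output (mirror) path
      #   the "+..." filter keeps the crawl on the main domain
      #   -v  prints progress to the console
      httrack "http://www.example.com/" \
          -O "/tmp/example-offline" \
          "+*.example.com/*" \
          -v

    Additional "+"/"-" filters are what we used for the explicit inclusion and exclusion of external sites mentioned above.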

    A very handy feature of HTTrack is the error/warning log it writes after completing the download. It contains detailed information about errors that appeared on the pages and in the links within the pages that were processed:

    Error: "Bad Request" (400) at link www.resortwork.co.uk/job-details_Ski_hire:tech_or_mgr_or_driver_37854.aspx (from www.resortwork.co.uk/Jobs_A_to_Z.aspx)
    Error: "Not Found" (404) at link www.247recruit.net/images/applynow.png (from www.247recruit.net/css/global.css)
    Error: "Not Found" (404) at link www.247recruit.net/activate.html (from www.247recruit.net/247recruit_tefl_jobs_network.html)

    In our situation, we took the records of HTTP 400/404 errors and passed them to the web development department. Improvements are to be expected soon. ;-)

    Quality assurance on the full-featured desktop

    Unfortunately, the generated output of HTTrack was still incomplete, but luckily only images were missing. Being directly on the web server, we simply copied the missing images from the original source folder into our offline version. After that, we created an archive and transferred the file securely to our local workspace for further review and checks. From that point on, it wasn't necessary to get any more files from the original web server, and we could focus completely on browsing and navigating through the offline version to isolate visual differences and functional problems.

    As said, the original web site runs on ASP.NET Web Forms and uses postback calls for interactions like search, pagination and partly for navigation; this is the main area for improving the offline experience. Of course, as in standard web development, it is advisable to test with various browsers, and strangely we discovered that the offline version looked pretty good in Firefox, Chrome and Safari, but not in Internet Explorer. A quick look at the HTML source shed some light on this: there are conditional CSS inclusions based on the user agent. HTTrack does not identify itself as Internet Explorer, so we didn't have the necessary overrides for this browser. Not problematic in our case after all, but you might have to pay attention to this and fetch the IE-specific files explicitly. While looking at the source code, we also found that HTTrack actually modifies the generated HTML output: on several occasions we discovered that <div> elements had been converted into <table> constructs for no obvious reason, even nested structures.

    Search 'e'nd destroy - sed (or Notepad++) to the rescue

    During our intensive root cause analysis of a couple of HTML/CSS problems that needed some extra attention, it was very helpful to be familiar with an editor that allows search and replace over multiple files, like sed - the stream editor for filtering and transforming text on Linux - or my personal favourite, Notepad++ on Windows. This allowed us to quickly fix a lot of anchors with onclick attributes and JavaScript code that still pointed to ASP.NET files instead of their generated HTML counterparts, like so (note the -i switch, which sed needs to edit the files in place rather than print to standard output):

    grep -lr -e '\.aspx' . | xargs sed -i -e 's/\.aspx/.html?/g'

    The additional question mark after the HTML extension helps to separate the query string from the actual target, and it solved all our missing hyperlinks very quickly. The same can be done in Notepad++ on Windows, too: just use the 'Replace in Files' feature and you are settled, especially in combination with regular expressions (regex).

    Landscape of browsers

    Okay, after several rounds of HTML/CSS code analysis and of searching and replacing strings in a pool of more than 4,000 files, we finally had a very good match of an offline browsing experience in Firefox and Chrome on Linux. Next, we transferred the modified set of files to a Windows 8 machine for review in Firefox, Chrome and Internet Explorer 7 to 10, and to a Mac mini running Mac OS X 10.7 to check the output in Safari and again in Chrome. Apart from IE, for the reasons already mentioned above, the results were identical. And last but not least, it was time to check the web site on tablets. Please continue reading in the follow-up articles: Taking web sites offline for demonstration on Galaxy Tablet, and Taking web sites offline for demonstration on iPad.
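
    As a footnote to the search-and-replace step above: when bulk-editing a pool of several thousand files, a dry run and backups are cheap insurance. A sketch, assuming the mirror lives in /tmp/example-offline:

      # 1. Preview which files still reference .aspx targets
      grep -rln -e '\.aspx' /tmp/example-offline

      # 2. Replace in place, keeping a .bak copy of every touched file
      grep -rl -e '\.aspx' /tmp/example-offline \
          | xargs sed -i.bak -e 's/\.aspx/.html?/g'

      # 3. Check only the HTML files (the .bak copies still contain .aspx),
      #    then clean up the backups
      grep -rln --include='*.html' -e '\.aspx' /tmp/example-offline || echo "all clear"
      find /tmp/example-offline -name '*.bak' -delete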

    Read the article

  • Chrome Mobile Monthly: Responsive vs Separate Sites

    Join us on Wednesday, October 31st at 9am PT for our Monthly Mobile Web Hangout! This month +Brad Frost will be joining us to talk about responsive design versus separate mobile sites. And in keeping with the season, it's a special Presidential Smackdown Edition. The US presidential race is in full swing, and the candidates are intensely debating the country's hot-button issues. The web design world is entrenched in our own debate about how to address the mobile web: should we create a separate mobile site or create a responsive experience instead? It just so happens that the two US presidential candidates have chosen different mobile web strategies for their official websites. In the red corner is Republican candidate Mitt Romney's dedicated mobile site, while in the blue corner is incumbent president Barack Obama's responsive website. Which will prevail? Sit back, crack open a cold one, and watch the battle unfold as Brad dissects the candidates' sites to uncover best practices and common mobile web pitfalls. From: GoogleDevelopers

    Read the article

  • Using a CDN for CMS software (multiple sites)

    - by SmokeyPHP
    I'm currently researching ideas for the media management side of a CMS I'm writing. I was looking at having images served from a CDN, which is fine on a single site, but I want all sites that run the CMS to make use of a CDN (which will most likely be custom developed, rather than a third-party service like S3). My main question is: is a multi-site CDN a good idea? I can't think of a downside, but have probably missed something - obviously the sites won't share the same folder, as I envisage the requests to be css.cdnsite.com/example.com/style.css or something along those lines. Having multiple sites in the same place will obviously make it easier for us to manage, as well as being cheaper, but then I wonder if it'll be worth it... Long story short: how should the CMS handle user-uploaded media across separate installations?

    - Just keep a local copy of all assets and serve them from the same site, like in days of yore?
    - Keep a local copy, force sites to use www. and have CDN subdomains per site?
    - Or use a single separate CDN for all sites?

    Apologies for the length of this question; I'm not sure if this should be multiple questions or not, as all parts are related and could affect each other.

    Read the article

  • Unable to view 2 local sites over network

    - by gentrobot
    I have 2 websites running on my local machine that I'd like to view from other machines on the same network. In /etc/apache2/sites-available/site1.com:

    <VirtualHost *:80>
        ServerName site1.com
        DocumentRoot /var/www/answers/app/webroot
        DirectoryIndex index.php
        <Directory "/var/www/answers/app/webroot">
            Options FollowSymLinks
            AllowOverride All
            Order allow,deny
            Allow from all
        </Directory>
    </VirtualHost>

    In /etc/apache2/sites-available/site2.com:

    <VirtualHost *:80>
        ServerName site2.com
        DocumentRoot /var/www/answers2/app/webroot
        DirectoryIndex index.php
        <Directory "/var/www/answers2/app/webroot">
            Options FollowSymLinks
            AllowOverride All
            Order allow,deny
            Allow from all
        </Directory>
    </VirtualHost>

    I have added 2 entries to the /etc/hosts file:

    127.0.0.1 site1.com
    127.0.0.1 site2.com

    Now, when I point the browser on my machine to site1.com, it shows me the first site, and pointing the browser to site2.com shows me the second site. However, when I type the local IP of my machine into the browser, it always shows site2. How can I change it to switch between site1 and site2? Is there a way that I can view both sites from another machine (especially mobile devices over a wireless network)?
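
    One way to make both name-based vhosts reachable from other machines on the LAN - a sketch, assuming the server's local IP is 192.168.1.10 - is to give each client machine matching hosts entries, so the browser sends a Host header Apache can match:

      # On each client (Linux/Mac: /etc/hosts; Windows: C:\Windows\System32\drivers\etc\hosts),
      # point both names at the server's LAN address:
      192.168.1.10    site1.com
      192.168.1.10    site2.com

    Browsing by bare IP sends no matching ServerName at all, so Apache typically falls back to the first virtual host defined for that address (you can check the order with apache2ctl -S) - which is why the IP always shows the same site. For mobile devices, where the hosts file can't easily be edited, a small DNS server on the network (e.g. dnsmasq) is the usual workaround.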

    Read the article

  • Arguments for discouraging satellite sites?

    - by Jjdelc
    I am working with a client who has read about satellite sites in an SEO book and has been building hundreds of keyword-rich domains (buytoyotacorona1989cheap.com, brandnewsuvinred.com) with content specific to each domain. They all link to a main domain (CompanyName.com) where most of the information is either repeated (from the other sites) or new. I told him to drop all the other domains and focus only on building good content for the main site, as it is too difficult to maintain so many websites, plus they might look like link farms to Google. He told me to do a Google search for "Buy Toyota cheap", and two of his websites were listed in the top 10. So it seems to be doing some good, but I get the feeling that what he is doing is wrong. What other arguments are there to discourage this practice? Or is he doing the right thing? My arguments have helped him decide to go down from hundreds to close to one hundred (because of the cost of maintenance), but I believe he should only have one or two sites. PS: The business is not actually about cars.

    Read the article

  • Restrict only some plugins to specific sites in Google Chrome

    - by Christian
    I am looking for a way to set up Google Chrome so that it will run a certain plug-in (Java, what else?) only on whitelisted sites, but other plug-ins (like the PDF viewer) everywhere. From playing with the policies available for Chrome, I think there are basically two levels of plug-in management:

    - List of disabled plugins / enabled plugins: controls whether a plug-in exists for the browser at all. This pair of policies applies to plug-ins, but not to sites.
    - Default plug-in settings / allow plug-ins on sites: controls on which sites plug-ins can run. This set of policies applies to sites, but not to individual plug-ins, and it cannot override the first pair.

    There appears to be no way to configure Chrome so that some plug-ins only run on whitelisted sites, while others run everywhere by default. I have also looked at filtering content at the firewall/proxy level, but I'm not convinced it can be done securely there. Filtering by URLs (file names) or content types can be circumvented trivially, and identification by content inspection cannot be safe either.
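
    For illustration, here is roughly how those two policy levels look when deployed as a managed policy file for Chrome on Linux - a sketch from memory, so treat the policy names (DefaultPluginsSetting, PluginsAllowedForUrls, DisabledPlugins) and values as something to verify against the current policy list. It also demonstrates the exact limitation described above, since the URL whitelist applies to all plug-ins at once:

      # Managed Chrome policies on Linux live under /etc/opt/chrome/policies/managed/
      sudo tee /etc/opt/chrome/policies/managed/plugins.json <<'EOF'
      {
        "DefaultPluginsSetting": 2,
        "PluginsAllowedForUrls": ["https://intranet.example.com"],
        "DisabledPlugins": ["Java*"]
      }
      EOF

    Here DefaultPluginsSetting: 2 blocks plug-ins everywhere, PluginsAllowedForUrls re-allows them on the whitelisted site, and DisabledPlugins removes Java entirely - there is no per-plug-in counterpart to the URL whitelist.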

    Read the article

  • Explore Historic Sites from the Comfort of Your Desktop with Google’s ‘World Wonders Project’

    - by Asian Angel
    Have you always wanted to explore historic sites across the world but lack the extra time and/or funds to do so? Then take heart! Now you can visit historic sites to your heart's content from home with Google's 'World Wonders Project'. Note: The screenshot shown above is from the 'Archaeological Areas of Pompei' site. You can explore exotic locations such as Pompei, the Palace and Park of Versailles, Shark Bay, the Tenryu-ji-Temple in Ancient Kyoto, and more. The World Wonders Project Homepage The World Wonders Project YouTube Channel

    Read the article

  • Joomla Sites hacked by DR-MTMRD [closed]

    - by RedLEON
    Possible Duplicate: My Sites Were Hacked. What To Do?

    A few of my Joomla sites were hacked. After I became aware of this, I did the following:

    - Changed the hosting passwords (MySQL, FTP, control panel)
    - Renamed the Joomla admin user back to "admin" in the users table (how had the hacker changed the user name?)
    - Upgraded Joomla to the latest version
    - Added a php.ini to the root directory of the host
    - Disabled CGI access

    But the site is still hacked. I checked the index.php file and overwrote it with the original index.php, but the site is still hacked. How is this possible?
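
    When a site keeps getting re-compromised like this, the usual culprit is a leftover backdoor file or a writable location the attacker can still reach. A few generic first-response checks, sketched for a typical shared-hosting Joomla root (run from the site's top-level directory):

      # PHP files modified recently - a quick way to spot injected code
      find . -type f -name '*.php' -mtime -7 -ls

      # Common obfuscation patterns used by PHP backdoors
      grep -rln --include='*.php' -E 'base64_decode|eval\(|gzinflate' .

      # World-writable directories an attacker could drop files into
      find . -type d -perm -o+w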

    Read the article

  • Permissions & File Structure w/ nginx & multiple sites

    - by Michael
    I am using nginx for the first time as a long-time Apache user. I set up a Linode to test everything and to eventually port over my websites. Previously I had /home/user/www (wwwroot); I am looking at doing something similar with /srv/www/domain/www (wwwroot) rather than /srv/domain (wwwroot). The reason is that many of the sites are WordPress, and one of the things I do for security is to move the config file one level above wwwroot, and I can't have multiple configuration files from multiple domains in the same top-level folder. Since I own all the sites, I wasn't going to create a user for each domain. My user is a member of www-data; I was going to use 2770 for www and have domain/www for each new domain, with www owned by group www-data. Is this the best way to handle this?
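
    For what it's worth, a sketch of how that layout could be created for one domain - the user and domain names are placeholders, and the setgid bit (the leading 2 in 2770) is what makes new files inherit the www-data group:

      # Web root one level below the domain folder, so the config can sit above it
      sudo mkdir -p /srv/www/example.com/www

      # Owner stays the deploy user; group www-data for the server
      sudo chown -R myuser:www-data /srv/www/example.com
      sudo chmod 2770 /srv/www/example.com/www

      # The deploying user must be in the www-data group for 2770 to work
      sudo usermod -aG www-data myuser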

    Read the article

  • Localhost permissions given different values in fireftp and cs4 dreamweaver

    - by YsoL8
    While testing a file uploader on my localhost (MAMP on Mac), I've hit a problem. Trying to fix a folder permissions problem, I used CS4 Dreamweaver's permissions screen to set 0777 permissions. However, these wouldn't apply and stayed stuck at 0, so I opened FireFTP and accessed the folder in the local panel. The permissions there are 0777. So I have a folder that has permissions of 0 and 0777 at the same time. How can I resolve this and make sure the permissions are 0777?
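
    When two GUI tools disagree like this, the terminal is the tie-breaker, since both are only wrappers around the same filesystem. A quick check, with the folder path as a placeholder:

      # Show what the filesystem actually reports for the folder
      ls -ld /Applications/MAMP/htdocs/uploads

      # Set the permissions directly; -v confirms what was changed
      chmod -v 0777 /Applications/MAMP/htdocs/uploads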

    Read the article

  • Best sites to find good .NET Developers

    - by Mag20
    I am looking for good sites to post a position for a .NET developer. I already tried: Craigslist - got about 10 resumes, but most couldn't answer our technical questions; StackOverflow Careers - no responses. What sites did you have success with for finding good developers?

    UPDATE 1: I wanted to provide some more information. My company is in NJ. We are a small startup, fewer than 10 people. Monster, Dice and CareerBuilder all charge around $500 a month per posting, which seems a bit much. Also, only Dice specifically targets technical positions. With Monster and CareerBuilder I am a bit worried about having to go through hundreds of resumes that don't apply.

    Read the article

  • What's New in WebCenter Sites 11g

    - by pweckerl
    It is no secret that the online experience has fundamentally changed through social computing. More and more often, visitors to a web site do not just want to consume but also to interact, and to share their experiences with others via social networks. For online marketers this opens up a multitude of opportunities, but also challenges. Companies have to integrate these social components into their online presences to meet the expectation of an interactive experience, while at the same time keeping control - and with it a certain degree of security for the integrity of their own brand and reputation. With the new version of Oracle WebCenter Sites, online managers have a comprehensive tool at their disposal to make their sites even more interactive and to engage visitors even more closely. Social login and social sharing, user-generated content such as ratings and comments, and many other new features make Oracle WebCenter Sites better than ever. More on the current version, and on WebCenter topics in general, can also be found on the Oracle WebCenter Blog (https://blogs.oracle.com/webcenter/entry/what_s_new_in_webcenter1).

    Read the article

  • Services - Separate Sites or One Site - Impact on SEO

    - by Lynda
    I have a client who is a lawyer specializing in criminal defense and DUI; however, he does not show up well in Google. In researching, I found that the sites that rank better have much more content for those specialties than his site does, and my thought is that he needs to add more quality content to rank better for those searches. On his site he mentions his specialties, but he also has various personal things that reflect his interests. These are clearly separated from the business portion. My questions are: 1) should he separate his personal information into a new domain, and 2) should he have a separate URL for each of his specialties? Or would one URL work as long as everything is clearly separated? I read once that for legal services to rank well, you should make a separate site for each specialty and have that site focus solely on that service.

    Read the article

  • Yahoo announces an agreement with Facebook: the two sites strengthen their ties and become partners

    Yahoo and Facebook have just announced a tie-up between their two sites. Indeed, their services are now linked in certain respects. A user with an account on both platforms will be able to see notifications about his activities on one site appear on the other. In addition, users of certain services offered by Yahoo (such as Flickr or Yahoo Answers) will be able to share the data hosted there more easily with their Facebook friends. It will also be possible for them to update both their Yahoo profile and their Facebook profile in one ...

    Read the article

  • Why subdomains of Blogspot/WordPress like sites are treated as different domains or sites?

    - by Thedijje
    As far as I know, maps.google.com and mail.google.com both come under the same domain; they are subdomains. The entire web treats these subdomains as part of the main domain, and they share the same Alexa rank, PageRank and so on. But on the other hand, take a look at blogspot.com, wordpress.com or webs.com: these are single domains, yet the blogs or websites under them are treated as different sites. Each has its own URL, and they all have different PageRank and Alexa rank as well. There are millions of subdomains under those few domains, sharing almost the same IP addresses, hosting and CMS, so why are they treated as different domains?

    Read the article

  • HTG Explains: How Hackers Take Over Web Sites with SQL Injection / DDoS

    - by Jason Faulkner
    Even if you’ve only loosely followed the events of the hacker groups Anonymous and LulzSec, you’ve probably heard about web sites and services being hacked, like the infamous Sony hacks. Have you ever wondered how they do it? There are a number of tools and techniques that these groups use, and while we’re not trying to give you a manual to do this yourself, it’s useful to understand what’s going on. Two of the attacks you consistently hear about them using are “(Distributed) Denial of Service” (DDoS) and “SQL Injections” (SQLI). Here’s how they work. Image by xkcd

    Read the article

  • Sites with overlapping code-bases. Developing multiple sites with little changes

    - by Web Developer
    I have to develop 3 different sites (domain names are examples only):

    - video.com for hosting video
    - audio.com for hosting audio
    - docs.com for hosting docs

    Almost 80% of the functionality is the same for all three, with the remaining 20% being completely different features... How do I handle this? How do sites like SO handle this? I am developing this in the Yii framework and was thinking of having the differing features as modules, but in that case the menu/code links in the HTML can become difficult to manage.

    Read the article

  • 1000 most visited sites on the web: A Google Analysis

    Google has released an analysis of the 1000 most visited sites on the web. Considering that we own/operate 3 of the top 10 sites and have a significant interest in Facebook - plus this recent report stating that Microsoft employees are the most social-media-savvy - we will go to great lengths to show how well we can operate in our cloud and social media integration and collaboration strategies. William Tay 2000-2010 | Swinging Technologist http://www.softwaremaker.net/blog

    Read the article

  • Apple font question

    - by creocare
    So I just bought a new 15-inch MacBook Pro. This is my first Mac. I really hate the Apple font in Dreamweaver or when I'm doing any kind of coding; I can't read it. I should buy glasses, but in the meantime is there any way to change the font and font size in Dreamweaver? Thanks.

    Read the article

  • How to set default save location in applications

    - by user23950
    Especially in Dreamweaver and Photoshop. Also, do you know of any application that lets me jump through all the common save locations for Dreamweaver? For example, I save .php files in C:\Wamp\www, but when I create a new site, its default save location is C:\Users\username\Documents\Unnamed Site 2.

    Read the article

  • Can't connect to certain HTTPS sites

    - by mind.blank
    I've just moved to a new apartment with an internet connection via a router, and I'm finding that I can't connect to quite a few sites that use SSL. For example, trying to connect to PayPal:

    curl -v https://paypal.com
    * About to connect() to paypal.com port 443 (#0)
    *   Trying 66.211.169.3... connected
    * successfully set certificate verify locations:
    *   CAfile: none
        CApath: /etc/ssl/certs
    * SSLv3, TLS handshake, Client hello (1):
    * Unknown SSL protocol error in connection to paypal.com:443
    * Closing connection #0
    curl: (35) Unknown SSL protocol error in connection to paypal.com:443

    curl -v -ssl https://paypal.com gives the same output. For some sites it works:

    curl -v https://www.google.com
    * About to connect() to www.google.com port 443 (#0)
    *   Trying 74.125.235.112... connected
    * successfully set certificate verify locations:
    *   CAfile: none
        CApath: /etc/ssl/certs
    * SSLv3, TLS handshake, Client hello (1):
    * SSLv3, TLS handshake, Server hello (2):
    * SSLv3, TLS handshake, CERT (11):
    * SSLv3, TLS handshake, Server key exchange (12):
    * SSLv3, TLS handshake, Server finished (14):
    * SSLv3, TLS handshake, Client key exchange (16):
    * SSLv3, TLS change cipher, Client hello (1):
    * SSLv3, TLS handshake, Finished (20):
    * SSLv3, TLS change cipher, Client hello (1):
    * SSLv3, TLS handshake, Finished (20):
    * SSL connection using ECDHE-RSA-RC4-SHA
    * Server certificate:
    *   subject: C=US; ST=California; L=Mountain View; O=Google Inc; CN=www.google.com
    *   start date: 2011-10-26 00:00:00 GMT
    *   expire date: 2013-09-30 23:59:59 GMT
    *   common name: www.google.com (matched)
    *   issuer: C=ZA; O=Thawte Consulting (Pty) Ltd.; CN=Thawte SGC CA
    * SSL certificate verify ok.
    > GET / HTTP/1.1
    > User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3
    > Host: www.google.com
    > Accept: */*
    >
    < HTTP/1.1 302 Found
    < Location: https://www.google.co.jp/
    . . .

    I'm using Ubuntu 12.04, with Windows 7 installed as well.
    These sites work on Windows :( Not sure if this information helps, but I ran ifconfig and got the following:

    eth0      Link encap:Ethernet  HWaddr 1c:c1:de:bc:e2:4f
              inet6 addr: 2408:c3:7fff:991:686b:8d18:81b3:8dd1/64 Scope:Global
              inet6 addr: 2408:c3:7fff:991:1ec1:deff:febc:e24f/64 Scope:Global
              inet6 addr: fe80::1ec1:deff:febc:e24f/64 Scope:Link
              UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
              RX packets:87075 errors:0 dropped:0 overruns:0 frame:0
              TX packets:54522 errors:0 dropped:0 overruns:0 carrier:0
              collisions:0 txqueuelen:1000
              RX bytes:78167937 (78.1 MB)  TX bytes:10016891 (10.0 MB)
              Interrupt:46 Base address:0x4000

    eth1      Link encap:Ethernet  HWaddr ac:81:12:0d:93:80
              inet6 addr: fe80::ae81:12ff:fe0d:9380/64 Scope:Link
              UP BROADCAST MULTICAST  MTU:1500  Metric:1
              RX packets:0 errors:0 dropped:0 overruns:0 frame:498
              TX packets:0 errors:26 dropped:0 overruns:0 carrier:0
              collisions:0 txqueuelen:1000
              RX bytes:0 (0.0 B)  TX bytes:0 (0.0 B)
              Interrupt:17

    lo        Link encap:Local Loopback
              inet addr:127.0.0.1  Mask:255.0.0.0
              inet6 addr: ::1/128 Scope:Host
              UP LOOPBACK RUNNING  MTU:16436  Metric:1
              RX packets:630 errors:0 dropped:0 overruns:0 frame:0
              TX packets:630 errors:0 dropped:0 overruns:0 carrier:0
              collisions:0 txqueuelen:0
              RX bytes:39592 (39.5 KB)  TX bytes:39592 (39.5 KB)

    ppp0      Link encap:Point-to-Point Protocol
              inet addr:180.57.228.200  P-t-P:118.23.8.175  Mask:255.255.255.255
              UP POINTOPOINT RUNNING NOARP MULTICAST  MTU:1492  Metric:1
              RX packets:39631 errors:0 dropped:0 overruns:0 frame:0
              TX packets:22391 errors:0 dropped:0 overruns:0 carrier:0
              collisions:0 txqueuelen:3
              RX bytes:43462054 (43.4 MB)  TX bytes:2834628 (2.8 MB)
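
    A handshake that dies right after the Client Hello on some hosts but not others is a classic sign of either a TLS-version issue or of large handshake packets being dropped on a PPPoE link (note the MTU:1492 on ppp0 above). Two quick checks worth trying - a sketch, where the MTU value is only a test figure, not a recommendation:

      # 1. See whether forcing a specific TLS/SSL version changes the behaviour
      openssl s_client -connect paypal.com:443 -tls1

      # 2. Temporarily lower the MTU on the PPPoE interface and retry
      sudo ip link set dev ppp0 mtu 1400
      curl -v https://paypal.com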

    Read the article

  • What Do Your Customers Want in an Online Experience?

    - by Christie Flanagan
    In a time when customers have an increasing number of choices and an increasing level of control over their relationships with brands, what matters most is engagement. In order to engage your customers online, you need to provide them with a relevant, interactive and multichannel experience. Check out this video to see the kind of engaging online experience that Oracle WebCenter can power for your customers. Want to learn more? Visit our Connected Customer Experience Resource Center to:

    - See a demonstration of how easy it is for marketers and other non-technical business users to create and manage online experiences like the one above with Oracle WebCenter Sites
    - Hear Ancestry.com describe how they use Oracle WebCenter Sites to deliver an online experience that converts site visitors into customers and keeps them coming back to learn more about their family histories
    - Hear what analysts are saying about the exciting new and enhanced web experience management capabilities in Oracle WebCenter Sites 11g

    Read the article

  • What is /etc/apache2/sites-available used for and is it necessary?

    - by Mariane
    I have 3 sites, each with a specific IP, running on Apache 2 (up-to-date Ubuntu). To put a site online, I just created a file in /etc/apache2/sites-enabled, and in this file I told Apache which directory was the root directory for the site and to which IP it should correspond. So I have:

    000-default
    001-www.lapf.eu
    002-www.felkin.info
    003-www.seidhr.fr

    in this directory. My first site, lapf, suddenly lost contact with its database after the domain name was transferred from another registrar to the registrar that is also hosting the site's data. Then I did an update, and I reinstalled mysql-server and mysql-common, and I did I-have-forgotten-what to reinstall the locales (utf8 and such) which had vanished for some reason. This fixed my first site. Now I have noticed that the other 2 sites are offline. Pointing a browser to them just hangs until timeout. They used to function, and their domain names did not move; they are still registered at the same place. The files are still in /etc/apache2/sites-enabled. I noticed another directory, /etc/apache2/sites-available, with just default and default.ssl in it. Why are there 2 directories, sites-enabled and sites-available? Should I copy the files from "sites-enabled" into "sites-available"? Or should I put a modified version of each in "sites-available"?

    Output of "apache2ctl -S":

    VirtualHost configuration:
    92.243.20.169:80       Charlotte (/etc/apache2/sites-enabled/001-www.lapf.eu:1)
    92.243.21.141:80       xvm-21-141.ghst.net (/etc/apache2/sites-enabled/002-www.felkin.info:1)
    92.243.4.114:80        xvm-4-114.ghst.net (/etc/apache2/sites-enabled/003-www.seidhr.fr:1)
    wildcard NameVirtualHosts and _default_ servers:
    *:80                   is a NameVirtualHost
             default server Charlotte (/etc/apache2/sites-enabled/000-default:1)
             port 80 namevhost Charlotte (/etc/apache2/sites-enabled/000-default:1)
    Syntax OK
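
    For what it's worth, the two directories are a Debian/Ubuntu convention: sites-available holds every virtual host definition you have, and sites-enabled holds symlinks to only those that Apache should actually load. A sketch of the usual workflow, using one of the site files above:

      # Keep the real config in sites-available
      sudo mv /etc/apache2/sites-enabled/001-www.lapf.eu /etc/apache2/sites-available/

      # Enable it by symlinking into sites-enabled (a2ensite does exactly this)
      sudo a2ensite 001-www.lapf.eu

      # Disable a site later without deleting its config
      sudo a2dissite 001-www.lapf.eu

      # Reload Apache to apply either change
      sudo service apache2 reload

    Plain files sitting directly in sites-enabled work too - Apache simply includes whatever is there - so this layout is housekeeping rather than the likely cause of the two sites timing out.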

    Read the article
