Search Results


  • FreeBSD nginx installation error

    - by Asaf Nevo
    I have a VPS with the Apache web server installed. I'm trying to install nginx on it, since my new server will need to handle a large number of simultaneous connections. I used this install guide and did:

        cd /usr/ports/www/nginx
        make install clean

    However, I get this error:

        adding module in /usr/ports/www/nginx/work/arut-nginx-dav-ext-module-0e07a3e
        ./configure: error: no /usr/ports/www/nginx/work/arut-nginx-dav-ext-module-0e07a3e/config was found
        ===> Script "configure" failed unexpectedly.

    I'm pretty new to FreeBSD and I am used to controlling my server through DirectAdmin. What should I do next?
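
    A hedged workaround for readers hitting the same failure: the error points at a third-party WebDAV module whose config script is missing from the port's work directory, so if that module isn't actually needed, rebuilding the port with the option unchecked may get past it (the exact option name in the dialog is an assumption):

        # Re-select the port's options and disable the DAV ext module,
        # then rebuild from a clean state.
        cd /usr/ports/www/nginx
        make config          # uncheck the HTTP_DAV_EXT option in the dialog
        make clean
        make install clean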

    Read the article

  • Trying to test domain collapsing / consolidation validity for SEO purposes

    - by Roy Rico
    At work, we're trying to determine the effectiveness of domain collapsing for SEO purposes. Our current structure is to have multiple web apps served from different servers, directly accessed by users at these public URLs:

        www1.somecompany.com/webapp1
        www2.somecompany.com/webapp2
        www3.somecompany.com/webapp3

    I'm proposing to put an Apache proxy in front of these applications that will mask the different domains and route each request to the proper server:

        PUBLIC URL                      routed/forwarded to    PRIVATE URL
        www.somecompany.com/webapp1         <----->            www1.somecompany.com/webapp1
        www.somecompany.com/webapp2         <----->            www2.somecompany.com/webapp2
        www.somecompany.com/webapp3         <----->            www3.somecompany.com/webapp3

    In terms of SEO/PageRank value, does this help?
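
    A minimal sketch of the fronting proxy described above, assuming Apache with mod_proxy and mod_proxy_http enabled (hostnames are the question's placeholders):

        <VirtualHost *:80>
            ServerName www.somecompany.com
            # Forward each public path to its private backend, and rewrite
            # Location headers in responses so redirects stay on www.
            ProxyPass        /webapp1 http://www1.somecompany.com/webapp1
            ProxyPassReverse /webapp1 http://www1.somecompany.com/webapp1
            ProxyPass        /webapp2 http://www2.somecompany.com/webapp2
            ProxyPassReverse /webapp2 http://www2.somecompany.com/webapp2
            ProxyPass        /webapp3 http://www3.somecompany.com/webapp3
            ProxyPassReverse /webapp3 http://www3.somecompany.com/webapp3
        </VirtualHost>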

    Read the article

  • Configuring vsftpd with nginx on Ubuntu

    - by arby
    I have vsftpd installed on Ubuntu 12.04 LTS along with nginx, PHP, and SQL on an Amazon EC2 instance. The web server is good to go, but I'm having trouble connecting to the FTP server. I'm not quite sure how to set the privileges or what configuration options I might be missing. By default, the web root is at /usr/share/nginx/www and it is owned by root:root. The web server runs as user www-data in the group www-data. I've opened port 21 and set the passive ports in the EC2 backend and the ufw firewall. In vsftpd.conf, I have:

        ...
        anonymous_enable=NO
        local_enable=YES
        local_umask=0027
        chroot_local_user=YES
        pasv_enable=YES
        pasv_max_port=12100
        pasv_min_port=12000
        port_enable=YES
        ...

    Now I'm unsure how to create the FTP user that, when I log in, displays my web directory with write access. I've tried it a few different ways, but I keep running into errors (either no connection, no write access, or very slow timeouts).
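
    A hedged sketch of one common setup for this, assuming the FTP user should simply share group ownership of the web root with nginx (the user name is illustrative):

        # Create an FTP user whose home is the web root, then give the
        # www-data group write access so both FTP and nginx can use it.
        sudo adduser ftpuser --home /usr/share/nginx/www --no-create-home
        sudo usermod -aG www-data ftpuser
        sudo chown -R ftpuser:www-data /usr/share/nginx/www
        sudo chmod -R 2775 /usr/share/nginx/www   # setgid keeps group on new files

    Note that with chroot_local_user=YES, newer vsftpd builds refuse to chroot into a root directory the user can write to; if logins fail with that error, allow_writeable_chroot=YES (where supported) or a non-writable chroot root with writable subdirectories is the usual workaround.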

    Read the article

  • Redirect an old URL using web.config

    - by Dog
    I'm still very new to URL rewrites and redirects, and I'm having problems with something I thought was quite simple. I've just rebuilt a website and want to redirect the old URLs to the new ones. For example:

        http://www.mydomain.com/about.asp?lang=1

    should now be:

        http://www.mydomain.com/content.asp?id=100230&title=about&langid=1

    Unfortunately, everything I've tried either gives me errors or simply does nothing. Here is one rule I tried:

        <rule name="redirectoldabout" enabled="true" stopProcessing="true">
          <match url="( .*)" negate="true" />
          <conditions>
            <add input="{HTTP_HOST}" pattern="^mydomain\.com/about\.asp\?lang=1$" />
          </conditions>
          <action type="Redirect" url="http://www.mydomain.com/content.asp?id=100230&title=about&langid=1" redirectType="Permanent" />
        </rule>

    But I get an error back:

        HTTP Error 500.19 - Internal Server Error
        The requested page cannot be accessed because the related configuration data for the page is invalid.

    Any suggestions as to what I'm doing wrong? Thanks, Dog
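
    A hedged pointer for readers: two things in that rule stand out. Raw & characters are not legal inside XML attribute values (they must be written as &amp;), which by itself can make web.config unparseable and produce a 500.19; and {HTTP_HOST} contains only the host name, never the path or query string, so that condition can never match. A corrected sketch, assuming the IIS URL Rewrite module is installed:

        <rule name="redirectoldabout" enabled="true" stopProcessing="true">
          <match url="^about\.asp$" />
          <conditions>
            <add input="{QUERY_STRING}" pattern="^lang=1$" />
          </conditions>
          <action type="Redirect"
                  url="http://www.mydomain.com/content.asp?id=100230&amp;title=about&amp;langid=1"
                  appendQueryString="false" redirectType="Permanent" />
        </rule>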

    Read the article

  • Best solution for getting referral information in PHP

    - by absentx
    I am currently redoing some link structuring on a website. In the past we have used specific PHP files on the last step to direct the user to the proper place. For example:

        www.mysite.com/action/go-to-blue.php
        www.mysite.com/action/short/go-to-red.php
        www.mysite.com/action/tall/go-to-red.php

    We are now restructuring to eliminate the /short/ and /tall/ directories, which means "go-to-blue.php" will now do some extra processing to make sure it sends the visitor to the proper place. The static method of the past was quite effective: if the user left from that page, we knew we had it right. Now, since we are 301-redirecting action/short/go-to-red.php to just action/go-to-red.php, it is quite important on "go-to-red.php" that we realize a user may have been redirected from /short/ or /tall/. Right now I am using HTTP_REFERER, and in my testing it works fine, but after a lot of reading it is clear that this is not a solid solution on its own, so I started brainstorming other ways to verify the referral information. If we could check HTTP_REFERER plus some other test, I would feel confident we have a good system in place to send the visitor to the right place. Some questions/comments: Could I use a session variable or a cookie to accomplish this goal? If so, would it be maintained through the 301 redirect? I don't see why it wouldn't be. Passing the URL in the URL is not an option in this case.
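
    A hedged sketch in answer to the cookie question: a cookie set by the old URL does survive a 301, because the browser processes the Set-Cookie header before following the redirect (the cookie name and paths below are illustrative):

        <?php
        // On the old URL, e.g. /action/short/go-to-red.php:
        // remember the origin in a short-lived cookie, then redirect.
        setcookie('redirect_origin', 'short', time() + 60, '/');
        header('Location: http://www.mysite.com/action/go-to-red.php', true, 301);
        exit;

        <?php
        // On /action/go-to-red.php: prefer the cookie, fall back to the referrer.
        $origin = isset($_COOKIE['redirect_origin'])
            ? $_COOKIE['redirect_origin']
            : null;
        if ($origin === null && isset($_SERVER['HTTP_REFERER'])) {
            if (strpos($_SERVER['HTTP_REFERER'], '/short/') !== false) {
                $origin = 'short';
            } elseif (strpos($_SERVER['HTTP_REFERER'], '/tall/') !== false) {
                $origin = 'tall';
            }
        }

    Sessions behave the same way for browsers that keep the session cookie across the redirect, though a fresh visitor landing directly on the new URL will have neither signal.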

    Read the article

  • Why was my site rejected for Google AdSense?

    - by hyuun jjang
    I have a 3-year-old blog with around 16 articles/tutorials about programming problems and solutions. It has been getting a lot of views lately, so I decided to apply for a Google AdSense account. When I first applied via Blogger, Google replied with the following statement:

        Page Type: In order to participate in Google AdSense, publishers' websites and application information must satisfy the following guidelines:
        - Your website must be your own top-level domain (www.example.com and not www.example.com/mysite).
        - You must provide accurate personal information with your application that matches the information on your domain registration.
        - Your website must contain substantial, original content...

    So, as I understood it, I decided to buy a domain and point my Blogger blog to that new naked domain. Here is the newly bought domain, where all the content of my old blog now resides: http://icodeya.com/ — I reapplied, hoping that this time I would make the cut. But then I got this reply:

        Further detail: Unable to review your site: While reviewing http://www.icodeya.com/, we found that your site was down or unavailable. We suggest you check whether there was a typo in the URL submitted. When your site is operational, you can resubmit your application with the correct site by following the directions below.

    I'm a bit disappointed. Maybe I did something wrong with the DNS configuration or something, but you can clearly see that my site is fully functional. I heard that Google sends robots to crawl the site, etc. It's just sad, because I invested in a domain name and now I can't even find a way to earn from it. Any tips?
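
    A hedged diagnostic for this situation: the reviewer reported the www host specifically, and a bare domain that resolves while its www subdomain does not is a common DNS gap. A quick outside check (commands assume a Unix-like shell):

        dig +short icodeya.com A         # bare domain
        dig +short www.icodeya.com A     # should also return an address
        curl -I http://www.icodeya.com/  # should answer 200, not time out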

    Read the article

  • Why do I get an SPF softfail?

    - by johnlai2004
    I set up SPF on my LAMP server with Postfix, but for some reason I get this error:

        Received-SPF: softfail (mta1070.mail.re4.yahoo.com: domain of transitioning [email protected] does not designate 1.1.1.1 as permitted sender)

    I have two questions: 1) How do I troubleshoot this error? 2) I've been looking through my configuration files in an attempt to change [email protected] to [email protected], because anotherurl.com has the correct SPF TXT records. Where do I go to change this? I tried editing myhostname under /etc/postfix/main.cf, but it didn't do anything.
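
    A hedged sketch of both halves: a softfail means the receiving server looked up the envelope sender's domain and found an SPF record that does not list the sending IP, so either the record or the envelope-sender domain has to change (values below are the question's placeholders):

        ; DNS TXT record on the envelope-sender domain, listing the server's IP
        example.com.  IN  TXT  "v=spf1 ip4:1.1.1.1 a mx ~all"

        # /etc/postfix/main.cf -- myorigin (not myhostname) controls the domain
        # appended to locally generated mail, which sets the envelope sender
        myorigin = anotherurl.com

    After editing main.cf, `sudo postfix reload` applies the change.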

    Read the article

  • How to pass command line arguments to a program that is itself a command line argument in a Windows shortcut?

    - by Pawin
    I know I can add a command line argument/option to a shortcut this way, for example:

        "C:\Program Files\Internet Explorer\iexplore.exe" www.a.com

    so that IE connects to a.com when it starts up. What I would like is for IE to connect to a.com when I call it through another program, like the following:

        C:\Windows\SysWOW64\ForceBindIP.exe 192.168.1.151 "C:\Program Files\Internet Explorer\iexplore.exe" www.a.com

    This does not work: IE starts up but doesn't go to a.com. It seems the argument is either ignored or treated as an argument to ForceBindIP instead (I'm not sure). What I am trying to do is create two IE shortcuts such that each binds one IE window to one NIC and one particular website, so adding www.a.com to IE's own startup list won't help. The OS is Windows 8. Apologies if this has been asked and answered before; please suggest keywords for searching if that's the case.

    Read the article

  • Using mod_rewrite to shut down a website

    - by moolagain
    I am trying to shut down a website to everyone except my IP address, and I almost have it working. I cannot access www.mysite.com, but I can still access all folders that have another .htaccess file in them. I have a .htaccess file in /www with the following code:

        # Use this when the website is down
        RewriteEngine on
        # This allows access from my IP
        RewriteCond %{REMOTE_ADDR} !^(66\.777\.888\.99)$
        RewriteRule !down.php$ /down.php [L]

    Some folders in my site have .htaccess files in them. If such a file contains the line:

        RewriteEngine on

    I can still access that folder. For example, with a second .htaccess file in /www/about, I can still access mysite.com/about (but the .css file included on that page actually loads down.php). If I delete "RewriteEngine on", I get redirected to down.php. Any ideas? I think mod_rewrite gets confused with multiple .htaccess files. Thanks!
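
    A hedged explanation for the behavior: per-directory mod_rewrite rules are not inherited by default, so any subdirectory .htaccess that enables its own rewrite engine starts with an empty rule set. One sketch of a fix is to make each child inherit the parent's rules explicitly:

        # In each subdirectory .htaccess that switches the engine on:
        RewriteEngine on
        RewriteOptions Inherit   # re-apply the parent directory's rules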

    Read the article

  • Hosting a site on Amazon EC2

    - by Khalid Mushtaq
    I have recently bought an Amazon EC2 instance and now I want to host a website. I have googled and found some useful information, but there is some confusion in my mind. Suppose the domain name is "http://www.example.com". That's what I have done so far:

        1. I configured the domain locally on the EC2 instance, and it works fine when I open the URL in the instance's own browser. I pointed http://www.example.com to 127.0.0.1 in /etc/hosts to open it locally on the instance.
        2. I got one Elastic IP address and associated it with the instance.
        3. I changed http://www.example.com's A record to the Elastic IP that I got in the step above.

    Now what should I do? When a user opens my website anywhere in the world, will it resolve to my instance's IP address? Have I done the proper configuration for the website on the instance?
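
    A hedged verification sketch: the /etc/hosts entry only affects the instance itself, so the real test is resolution and reachability from outside (the domain is the question's placeholder):

        dig +short www.example.com A     # from any outside machine: should print the Elastic IP
        curl -I http://www.example.com/  # should reach the instance's web server

    Once outside resolution works, removing the 127.0.0.1 hosts entry on the instance avoids masking DNS problems during testing.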

    Read the article

  • Root folder PHP scripts not running in nginx

    - by Thermionix
    I'm running nginx with php-fpm on Ubuntu Server 12.04. Attempting to access /var/www/test.php (via https://example.net/test.php) downloads the script instead of executing it. If I place test.php in a subdirectory, i.e. /var/www/test/test.php, it executes.

    root.conf:

        root /var/www;
        include php-fpm.conf;
        location ~ /\. { access_log off; log_not_found off; deny all; }

    php-fpm.conf:

        location ~ \.php$ {
            try_files $uri =404;
            fastcgi_pass unix:/var/run/php5-fpm.socket;
            include fastcgi_params;
        }

    fastcgi_params:

        fastcgi_param QUERY_STRING $query_string;
        fastcgi_param REQUEST_METHOD $request_method;
        fastcgi_param CONTENT_TYPE $content_type;
        fastcgi_param CONTENT_LENGTH $content_length;
        fastcgi_index index.php;
        fastcgi_param HTTPS on;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        #fastcgi_param SCRIPT_FILENAME $request_filename;
        fastcgi_param SCRIPT_NAME $fastcgi_script_name;
        fastcgi_param REQUEST_URI $request_uri;
        fastcgi_param DOCUMENT_URI $document_uri;
        fastcgi_param DOCUMENT_ROOT $document_root;
        fastcgi_param SERVER_PROTOCOL $server_protocol;
        fastcgi_param GATEWAY_INTERFACE CGI/1.1;
        fastcgi_param SERVER_SOFTWARE nginx/$nginx_version;
        fastcgi_param REMOTE_ADDR $remote_addr;
        fastcgi_param REMOTE_PORT $remote_port;
        fastcgi_param SERVER_ADDR $server_addr;
        fastcgi_param SERVER_PORT $server_port;
        fastcgi_param SERVER_NAME $server_name;
        # PHP only, required if PHP was built with --enable-force-cgi-redirect
        fastcgi_param REDIRECT_STATUS 200;

    Read the article

  • Cisco WS-C4948-10GE SFP+?

    - by Brian Lovett
    I have a pair of Cisco WS-C4948-10GEs that we need to connect to a new switch that has SFP+ and QSFP ports. Is there an X2 module that supports this? If so, can someone name the part number that will work? I have found some information, but want to make sure I have the correct part number for our exact switches. Per the discussion in the comments, I believe I have a better understanding of things now. Would I be correct in saying that I need these SR modules on the Cisco side:

        http://www.ebay.com/itm/Cisco-original-used-X2-10GB-SR-V02-/281228948970?pt=LH_DefaultDomain_0&hash=item417a8d35ea

    then, on the switch with SFP+ ports, an SR-to-SFP+ transceiver like this:

        http://www.advantageoptics.com/SFP-10G-SR_lp.html?gclid=CKP4s-G27b4CFXQiMgodLD8AQA

    and finally, an SR cable such as this:

        http://www.colfaxdirect.com/store/pc/viewPrd.asp?idproduct=1551

    Am I on the right track here?

    Read the article

  • DevTeach Montreal 2012

    - by pluginbaby
    As always, I am extremely pleased to see DevTeach coming back to my city! DevTeach Montreal will take place at the Delta Hotel Centre-Ville, December 10-12, 2012. Note: you need to register to attend.

    Awesome content: 48 sessions in 8 tracks, plus 3 post-conference workshops (Azure, Windows Store apps, BI).

    Free events: not everything is paid! DevTeach is strongly dedicated to the dev community, and you will also find these free activities (meaning you don't have to be a conference attendee):

    Keynote - On December 10th at 6:30pm, the DevTeach keynote is free for anyone (but you need to register). The keynote will be given by Howard Dierking, a Program Manager on the Azure Application Platform and Tools team. > http://www.devteach.com/Keynote.aspx

    Windows Server 2012 Hands-On IT Camp - On December 11th at 6:30pm, IT Camps presented by Pierre Roman. > http://www.devteach.com/community/

    A Whirlwind Tour around Windows Phone 8 Development - On December 11th at 6:30pm, a Windows Phone 8 Camp presented by Paul Laberge. > http://www.devteach.com/community/

    See you there!

    Read the article

  • Subdomain still times out after being set up a month ago

    - by user8137
    I'm a newbie at this, and while this has probably been asked already, the answers I found online were too vague, so I would really appreciate specific step-by-step instructions. This is what I'd like to do: have the subdomain www.high-res.domain.com accessible to external customers with specific permissions (like FTP). We use Network Solutions to host domain.com. We recently added a new IP address to point to www.high-res.domain.com, and I gave the IP address to the company that hosts our website. Pinging www.high-res.domain.com resolves to the correct IP address, but the requests still time out. It's been a few weeks now, and it still behaves the same:

        C:\>ping XXX.XXX.X.XXX
        Pinging XXX.XXX.X.XXX with 32 bytes of data:
        Request timed out.
        Request timed out.
        Request timed out.
        Request timed out.
        Ping statistics for XXX.XXX.X.XXX:
            Packets: Sent = 4, Received = 0, Lost = 4 (100% loss)

    Tracert times out as well. I even went to DNS tools and a few other sites for checking this, and they show the same thing. I recently went into DNS management on our server (Windows Server 2003 SP1) and created an A record under DomainDnsZones, which shows as a CNAME when you look at it. Under the domain, it has two entries: one for the subdomain and one for the website host, each with separate IP addresses. Is this correct? The website people are too busy on another project to research it further, and my friends haven't gotten back to me. Please help. Thanks, KK
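
    A hedged note for readers: DNS resolving while ping and tracert time out does not by itself prove a DNS problem, since many hosts simply drop ICMP. Testing the actual service port is more conclusive (a sketch; assumes the subdomain is meant to serve FTP or HTTP):

        telnet www.high-res.domain.com 21    # FTP control port
        telnet www.high-res.domain.com 80    # or the web port, as appropriate

    If the port connects while ping fails, the record is fine and only ICMP is filtered; if neither connects, the firewall or the host behind the new IP is the next thing to check.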

    Read the article

  • How to write a blog for SEO purposes

    - by Mathieu Imbert
    I have a photo-sharing website, which provides very little textual content. Users can add tags to photos and a description, but this creates a lot of duplicate content, because most of the descriptions will be 'wow', 'lol', ... I don't think I should rely on users to build my SEO. I think it would be a great idea to write a blog and use it to describe the best photos, start contests, explain themes, in short: create original content that search engines will love. Our website's main URL is like www.domain.com, and our new blog is hosted at blog.domain.com. From an SEO perspective, is it a good idea to keep the blog separate from the main site? This has the advantage of leaving the original site unchanged, but will it add any PageRank to www.domain.com? If the blog ranks well, it will obviously pass some PageRank to the original site through links. What do you think is the best option from an SEO perspective? Include the blog in www.domain.com? Or leave it at blog.domain.com?

    Read the article

  • One site being in a subdirectory of another: does Google count this against you?

    - by Mick
    I have created two similar websites (relating to monetary systems). So far, one appears to be loved by Google and the other hated, and I'm struggling to work out why. This is a mystery to me because both sites were created by me with the same design philosophy, both in pure HTML, and both are packed to the rafters with references to, and information about, their respective subjects. One issue I'm worried about is the location of the sites. I got a web hosting package from hostmonster.com for the successful one, but the less-liked one is just an "add-on" that sits in a subdirectory of the successful one. I wonder if Google somehow detects this and treats it as a less significant website?

    EDIT: Just to clarify, even though one site is an add-on that sits in a subdirectory of the other, the URL is arranged to look like a root. I.e. the unpopular site can be accessed directly with a simple www.myunpopularsite.com name, without specifying any subdirectory.

    EDIT: Just in case it's important... say the popular site is called pop.com and the unpopular one unpop.com. In the webspace I've purchased, there is a directory called public_html. This is where I put the index.htm and all the other files of my popular site. When I purchased the add-on unpop.com, I made a subdirectory of public_html called unpop. It is within this "public_html\unpop\" that I placed the index.htm and all the other files of my unpopular site. Typing www.unpop.com into the address bar of a browser links directly to the contents of "public_html\unpop\", and the user is not aware that this site is sitting in a subdirectory of another site. BUT if you type "www.pop.com/unpop" into the address bar of a browser, you DO see the unpopular site.

    Read the article

  • Snow Leopard connecting to Ubuntu 10.04 through Samba failure -- need help fixing

    - by Chris Altman
    I have an Ubuntu 10.04 web server, and I want to connect to it with my OS X 10.6 machine through Finder. I have installed OpenSSH and Samba on the Ubuntu machine. In my smb.conf I have a share definition:

        [www]
        comment = Development Computer WWW
        path = /var/www
        writeable = yes
        browseable = yes
        allow hosts = 192.168.1.

    I can connect to the machine through Finder using a non-root user. When I attempt to add files through Finder, I get an "Insufficient Permissions" error. Please help. I am not sure whether the issue is in the Samba configuration or in OS X 10.6. Thank you
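
    A hedged sketch of one likely fix: the Samba share allows writes, but /var/www is typically owned by root, so the connecting Unix user still needs filesystem write permission there (user and group names are illustrative):

        # Give the connecting user write access via a shared group...
        sudo chgrp -R www-data /var/www
        sudo chmod -R g+w /var/www
        sudo usermod -aG www-data youruser

        # ...or force ownership at the share level in smb.conf:
        [www]
           force user = youruser
           create mask = 0664
           directory mask = 0775

    Both "writeable = yes" on the Samba side and write permission on the Unix side must agree before Finder can copy files in.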

    Read the article

  • Why is group write permission ignored in Ubuntu?

    - by NorthPole
    I want my user to have full access to the local Apache root folder, and I also want the Apache user to have full access to the same folder. What I did was create a new group called DevGroup, add my user and www-data to it, and change the permissions to 770 to allow full group access. But now it won't allow me or the Apache user any kind of access to the folder. Here is what I get with ls:

        drwxrwx--- 12 root DevGroup 4096 Sep 27 17:34 testFolder

    which seems perfect, but when I try to access the folder as my user I get this:

        var/www$ ls testFolder/
        ls: cannot open directory testFolder/: Permission denied

    And when I try to access a page in the folder from a browser:

        [Thu Sep 27 17:47:16 2012] [error] [client 127.0.0.1] PHP Fatal error: Unknown: Failed opening required '/var/www/testFolder/foo.php' (include_path='.:/usr/share/php:/usr/share/pear') in Unknown on line 0

    What's the problem, and how can I fix it?
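
    A hedged pointer for readers: group membership is evaluated at login, so a shell (or a running Apache) started before the user was added to DevGroup does not yet carry the group. A quick check and workaround (sketch):

        id                            # does the *current* session list DevGroup?
        newgrp DevGroup               # start a subshell with the group active
        sudo service apache2 restart  # so www-data picks up its new group too

    Logging out and back in fixes the user side permanently.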

    Read the article

  • Is defragging right for me?

    - by blade
    I am using Hyper-V on my Windows Server 2008 R2 x64 domain controller, with standard SATA drives. I read some threads on here about defragging but could not reach a conclusion about whether I should use it. Can anyone shed some light on whether this is right for me? Furthermore, which tool is best? There seem to be three:

        http://www.perfectdisk.com/products/business-perfectdisk11-server/key-features
        http://www.diskeeper.com/diskeeper/home/server-edition.aspx?id=40279&wid=7
        http://www.perfectdisk.com/products/business-perfectdisk11-hyper-v/learn-more

    Anyone have experience with this?

    Read the article

  • Not allowed to upload .html files to my own DNN site: is this normal?

    - by Jake M
    My question: our web host won't let us upload .html files to our DNN site through the DNN File Manager page. Is that normal, and should I push them to allow it? We recently transferred our website to a DotNetNuke site. We originally had our website on a Linux server with Python scripts handling the backend; obviously we now have a Windows server running .NET with ASP.NET code on the backend. Our web host is a local Australian company, and they are saying we can't upload any .html files to the main part of the server, i.e. www.ourdomain.com/Portals/0/. They say the only place I can upload .html files is via FTP to this folder:

        www.ourdomain.com/Portals/0/html_content

    This is a major problem for me because I am trying to upload my own skin, which means I need to upload a main.html file to www.ourdomain.com/Portals/0/skins/myskin/, but they won't let me! I guess what I am asking is: is this normal practice, and why would they not allow this? As an experienced web admin for Linux servers, and as someone who is used to being able to do whatever I want on my OWN server, this is something that really pis$%s me off!

    Read the article

  • Linux file permissions seem right but I can't write to a directory

    - by CaseyB
    I believe that I have the permissions set correctly, but I can't write to a directory. Here's my problem:

        cborders@Kraken:/var/www$ ls -la
        total 12
        drwxrwxr-x  2 webz webz 4096 2011-12-30 14:58 ./
        drwxr-xr-x 13 root root 4096 2011-12-30 14:58 ../
        -rw-rw-r--  1 webz webz  177 2011-12-30 14:58 index.html
        cborders@Kraken:/var/www$ id cborders
        uid=1000(cborders) gid=1000(cborders) groups=1000(cborders),4(adm),20(dialout),24(cdrom),46(plugdev),109(sambashare),113(lpadmin),114(admin),1002(webz)
        cborders@Kraken:/var/www$ mkdir test
        mkdir: cannot create directory `test': Permission denied

    The owner of the directory is a user called webz, and the permissions allow the user and group rwx access to it. I am in the webz group, but I still can't make any changes. What am I doing wrong here?
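
    A hedged observation: "id cborders" reports the account database, while a running shell keeps the group list it was started with, so a recently added webz membership may not be active in the session that runs mkdir. Comparing the two is a quick test (sketch):

        id            # no argument: groups of the current process
        newgrp webz   # or log out and back in, then retry mkdir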

    Read the article

  • VirtualHost wildcard subdomains

    - by Khuram
    We have one static IP, on which we have routed our company website, and we have set up a local Windows machine with WAMP as our testing server. We want virtual hosts to test our different apps. However, one new project uses wildcard subdomains, and when creating the subdomains we can't get the wildcard to work. We use:

        NameVirtualHost *

        <VirtualHost *>
            ServerAdmin admin@test
            DocumentRoot "E:/Wamp/www/corporate"
            ServerName companysite.com
        </VirtualHost>

        <VirtualHost *>
            ServerAdmin admin@test
            DocumentRoot "E:/Wamp/www/project"
            ServerName project.companysite.com
        </VirtualHost>

        <VirtualHost *>
            ServerAdmin admin@test
            DocumentRoot "E:/Wamp/www/project"
            ServerName *.project.companysite.com
        </VirtualHost>

    However, the last * wildcard does not work. Any help?
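
    A hedged fix for readers: Apache's ServerName does not accept wildcards; wildcard host matching belongs in ServerAlias. A sketch of the last block:

        <VirtualHost *>
            ServerAdmin admin@test
            DocumentRoot "E:/Wamp/www/project"
            ServerName project.companysite.com
            ServerAlias *.project.companysite.com
        </VirtualHost>

    Note that each test hostname must still resolve to the machine; the Windows hosts file does not support wildcard entries, so every subdomain used in testing needs its own hosts line (or a local DNS tool that supports wildcards).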

    Read the article

  • Update Google Sitemap for Mobile

    - by dimo414
    I have a series of utilities to generate Google sitemaps for my whole site. These files are massive and slow to build. We want to start telling Google that these pages are mobile-crawlable too, by adding them to mobile sitemaps, but the documentation is unclear about whether I need physically different files for my mobile URLs than for my normal ones. If this is my current sitemap:

        <?xml version="1.0" encoding="UTF-8" ?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <url>
            <loc>http://mobile.example.com/article100.html</loc>
          </url>
        </urlset>

    can I simply change it to:

        <?xml version="1.0" encoding="UTF-8" ?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
                xmlns:mobile="http://www.google.com/schemas/sitemap-mobile/1.0">
          <url>
            <loc>http://mobile.example.com/article100.html</loc>
            <mobile:mobile/>
          </url>
        </urlset>

    or do I need to create new files with the additional markup, alongside my existing files?

    Read the article

  • Make sure I've configured DNS correctly

    - by user22817
    On my Windows Server 2008 machine, I've added a new zone for my new domain, biografica.ro. I've added a name server for it, ns.biografica.ro, with the IP of the server, and added two A hosts for ns.biografica.ro and www.biografica.ro. I've also defined that name server with the registrar where I bought the domain and pointed it to my IP. Now, I understand that I have to wait 24-48 hours for the new domain to be available online. Question: is there a way to check on the server that I've configured it successfully? For instance, nslookup biografica.ro, nslookup www.biografica.ro, and ping www.biografica.ro do not work locally on the server. Can you give me any advice? Thanks.
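
    A hedged check that skips propagation entirely: query the zone directly against the server's own DNS service, bypassing whatever resolver the machine normally uses (sketch):

        nslookup biografica.ro 127.0.0.1
        nslookup www.biografica.ro 127.0.0.1

    If these answer correctly, the zone is fine and the wait is only for registrar/NS propagation; if they fail, the zone or its A records need fixing regardless of propagation.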

    Read the article

  • ArchBeat Link-o-Rama for 2012-04-12

    - by Bob Rhubart

    2012 Real World Performance Tour Dates | Performance Tuning | Performance Engineering (www.ioug.org)
    Coming to your town: a full day of real-world database performance with Tom Kyte, Andrew Holdsworth, and Graham Wood. Rochester, NY - March 8; Los Angeles, CA - April 30; Orange County, CA - May 1; Redwood Shores, CA - May 3.

    Oracle Technology Network Developer Day: MySQL - New York (www.oracle.com)
    Wednesday, May 02, 2012, 8:00 AM - 4:30 PM, Grand Hyatt New York, 109 East 42nd Street, Grand Central Terminal, New York, NY 10017.

    Webcast Series: Data Warehousing Best Practices (event.on24.com)
    April 19, 2012 - Best Practices for Workload Management of a Data Warehouse on Oracle Exadata. May 10, 2012 - Best Practices for Extreme Data Warehouse Performance on Oracle Exadata.

    Webcast: Untangle Your Business with Oracle Unified SOA and Data Integration (event.on24.com)
    Date: Tuesday, April 24, 2012. Time: 10:00 AM PT / 1:00 PM ET. Speakers: Mala Narasimharajan, Senior Product Marketing Manager, Oracle Data Integration; Bruce Tierney, Director of Product Marketing, Oracle SOA Suite.

    The Increasing Focus on Architecture (ArchBeat) (blogs.oracle.com)
    As a "third wave" of computing, cloud computing is changing how IT organizations, and individuals within those organizations, approach the creation of solutions.

    Updated SOA Documents now available in ITSO Reference Library (blogs.oracle.com)
    Nine updated documents have just been added to the IT Strategies from Oracle library, including SOA Practitioner Guides, SOA Reference Architectures, and SOA White Papers and Data Sheets. Access to all documents in the ITSO library is free to those with a free Oracle.com membership.

    WebLogic JMS Clustering and Spring | Rene van Wijk (middlewaremagic.com)
    Oracle ACE Rene van Wijk sets up a WebLogic cluster that includes a JMS environment, which will be used by Spring.

    Running Built-In Test Simulator with SOA Suite Healthcare 11g in PS4 and PS5 | Shub Lahiri (blogs.oracle.com)
    Shub Lahiri shows how the pre-installed simulator that comes with the SOA Suite for Healthcare Integration pack can be used as an external endpoint to generate inbound and outbound HL7 traffic on specified MLLP ports.

    In the cloud era, let's start calling IT what it is: 'Innovation Team' | Joe McKendrick (www.zdnet.com)
    Cloud, the third great shift in 50 years of computing, presents a golden opportunity for IT to get out in front and lead.

    Thought for the Day: "Why do we never have time to do it right, but always have time to do it over?" - Anonymous

    Read the article
