Search Results

Search found 22762 results on 911 pages for 'wcf client'.

Page 656/911

  • Hub Forum - Connecting Digital Influencers - Paris 10 & 11 October 2013

    - by Louisa Aggoune
    ORACLE sponsored the 4th edition of the HUB FORUM, held at the Espace Pierre Cardin. On 10-11 October 2013, more than 1,200 leaders from digital, communications, marketing, advertising and innovation gathered for two days of conference at the 4th edition of HUBFORUM Paris, organized by the HUB Institute. It is the event that brings together decision-makers from the digital world and offers them the chance to meet the leading communication experts of the moment, as well as to exchange views on practices that have proven themselves. It is an exceptional opportunity to talk about the future of digital marketing, new technologies and media. The HUB FORUM in figures: 18,580 mentions, 3,520 live stream views, 80 speakers, 1,200 participants. On this occasion, Pascal Hary, Director of Sales Development, eXperience Client & Social Europe, led a session on the theme #Social Trend 2014. To see the photo album: https://www.facebook.com/hubforum

    Read the article

  • Existing laravel 4 project gives 404 in browser

    - by Richard A
    I'm trying to set up a development environment on a virtual machine running Ubuntu 14.04 LTS using Nginx and HHVM. To do this, I followed the tutorial here. This goes well with a new installation of Laravel. But when I import an existing Laravel 4 project and try to open it on my actual machine (which will serve as the client, running Windows 7), I get a 404 File Not Found error on the screen while connecting to http://sav.savrichard.dev. I did add this to the hosts file with the correct IP address. The virtual machine receives the request and responds with a 404 error. How do I solve this error? I'm pretty new to Ubuntu so I'm not exactly sure what's wrong. The project is located at /var/www/sav.savrichard.net and the server configuration is as follows:
        server {
            listen 80 default_server;
            root /var/www/sav.savrichard.net/public;
            index index.html index.htm index.php;
            server_name sav.savrichard.dev;
            access_log /var/log/nginx/localhost.sav.savrichard.dev-access.log;
            error_log /var/log/nginx/localhost.sav.savrichard.dev-error.log error;
            charset utf-8;
            location / { try_files $uri $uri/ /index.php?$query_string; }
            location = /favicon.ico { log_not_found off; access_log off; }
            location = /robots.txt { log_not_found off; access_log off; }
            error_page 404 /index.php;
            include hhvm.conf;
            # Deny .htaccess file access
            location ~ /\.ht { deny all; }
        }
    And the hhvm.conf file is:
        location ~ \.(hh|php)$ {
            fastcgi_keep_conn on;
            fastcgi_pass 127.0.0.1:9000;
            fastcgi_index index.php;
            fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
            include fastcgi_params;
        }
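    A few checks that often clear up a 404 on an imported Laravel 4 project (a sketch, assuming composer is installed, the paths from the question, and that HHVM runs as www-data):

        cd /var/www/sav.savrichard.net
        composer install                              # restore the vendor/ directory an imported project usually lacks
        sudo chown -R www-data:www-data app/storage
        chmod -R 775 app/storage                      # Laravel 4 needs a writable app/storage
        sudo tail -n 50 /var/log/nginx/localhost.sav.savrichard.dev-error.log   # see what nginx/HHVM actually complains about

    If the error log shows "Primary script unknown" or a missing index.php, double-check that the imported project really has public/index.php at the path the root directive points to.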

    Read the article

  • OpenWorld 2011 Video Index

    - by Chris Kawalek
    We did quite a few virtualization videos this year at Oracle OpenWorld 2011. You can find all these and more on our YouTube channel.
    Virtualization Wrapup: Adam Hawley discusses the Oracle virtualization presence at Oracle OpenWorld 2011. http://www.youtube.com/oraclevirtualization#p/f/2/53_SQYljqN4
    Oracle Applications on iPad: Brad Lackey shows how you can access Oracle Applications on iPad. http://www.youtube.com/oraclevirtualization#p/f/9/3Ug5km3uxEQ
    Thinkquest.org and Oracle VM: Dan Herrup describes how Thinkquest.org is using Oracle VM to help kids learn how to solve real world problems with computer technology. http://www.youtube.com/oraclevirtualization#p/f/6/Bw-km5kqzEo
    Avaya and Oracle Virtualization: See Oracle desktop virtualization in action at Avaya's booth. http://www.youtube.com/oraclevirtualization#p/f/4/xIHRIijEPkM
    Eco-Features of Sun Ray Clients: Michael Dann shows off the Sun Ray 3 Plus and talks about the eco benefits of Oracle's extremely low power consumption client device for desktop virtualization. http://www.youtube.com/oraclevirtualization#p/f/3/ulArHGe1OmM
    Application and Desktop Access with Oracle Secure Global Desktop: Watch Jeff Harvey do a quick demo of Oracle Secure Global Desktop accessing Oracle Applications. http://www.youtube.com/oraclevirtualization#p/f/5/g_ikA7dwh0g
    Oracle VM VirtualBox for VDI: Andy Hall describes how enterprises leverage Oracle VM VirtualBox as part of their VDI deployments. http://www.youtube.com/oraclevirtualization#p/f/8/WmkeYlzgnZ8
    TechCast Live: The Coolest Virtualization Products: Interview with Andy Hall about the desktop virtualization portfolio. http://www.youtube.com/oraclevirtualization#p/f/7/VMkrAhZ83AA

    Read the article

  • How to connect to my US network overseas via VPN?

    - by GiH
    I purchased an Apple TV for my parents and I have a Netflix account. My parents live overseas, and I was wondering if they could use my account to get it to work. I read that it won't work unless you use proxies or a VPN, so I was wondering if it's possible for me to set up a VPN to my network in the US instead of paying for a service like StrongVPN? Setup: Router in the US: Airport Extreme. Router abroad: D-Link (not sure of the model). I know that the Apple TV doesn't have a built-in VPN client; maybe eventually when it's jailbroken there will be an app, but as of now I'll have to use the routers, right? Any other ideas are welcome as well!

    Read the article

  • What is the fastest way to back up a disk image over LAN?

    - by David Balažic
    Sometimes I boot SystemRescueCd or a similar live Linux on a PC to back up its hard drive over the local network to my server. I have noticed many times that the transfer speed is not optimal (slower than both the HDD and the network allow). Any rules of thumb on what to do and what to avoid? What I typically do is something like:
        dd bs=16M if=/dev/sda | nc ...                      # on the client
        nc ... | dd bs=16M of=/destination/disk/backup1     # on the server
    I also throw in lzop (others are way too slow) and sometimes on-the-fly md5sum calculation (of both the uncompressed and compressed stream). I try to add (m)buffer (or other alternatives) to improve throughput (and to get a progress indicator). I have noticed that even with enough free CPU, adding commands to the pipeline slows things down. Typically the destination is on an NTFS volume (accessed via ntfs-3g, with the big_writes option).
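    For what it's worth, a sketch of the kind of pipeline described above, with lzop and mbuffer on both ends (host name, port and buffer size are arbitrary, and the listen syntax differs between netcat flavours, some want plain nc -l 9000):

        # on the server (receiver): start this side first
        nc -l -p 9000 | mbuffer -m 512M | lzop -dc | dd bs=16M of=/destination/disk/backup1

        # on the client booted from the live CD (sender)
        dd bs=16M if=/dev/sda | lzop -c | mbuffer -m 512M | nc server.example 9000

    mbuffer's -m flag sets the in-memory buffer size; it also prints a throughput/progress line, which covers the progress-indicator wish without adding yet another stage to the pipe.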

    Read the article

  • Terminal Server 2003 Performance Troubleshooting

    - by MikeM
    Let me get your thoughts on terminal server performance problems. The server hosts on average 25 users who, after running some numbers, on average use 600MB of memory with their main applications running (web browser, Adobe Reader, IP phone client). All users are on the same LAN as the server. We constantly experience slow response and short session lockups. Combined CPU usage is on average 10%. What appears strange to me is that the system shows 29GB physical memory with 25GB of it free, while page file usage is about 50%, averaging 9GB used. Some server specs:
        OS: Server 2003 32-bit Enterprise with the /PAE flag
        RAM: 32GB
        CPU: 2x Quad Core @ 2.27GHz
        HD: RAID5 1.2GB
    After doing basic troubleshooting with Performance Monitor, I am led to believe that the performance problems are caused by the 32-bit OS limitation in addressing the full 32GB of physical memory, even though the /PAE flag is used. Can anyone suggest troubleshooting steps that could lead to a more conclusive answer? Thanks

    Read the article

  • How much overhead is there in persistent connections?

    - by nynex
    OK, so I'm musing over a little side project I want to start. Essentially it's a multi-session web-based FTP client. Multi-session in that you can log into several FTP servers at the same time and perform operations like moving a file from one FTP server to another. I'm doing this mainly to brush up on the new webdev technologies, particularly WebSockets. I'm using node.js + socket.io to keep a persistent bi-directional connection between the web browser and the web server. The web server will also have persistent connections to each FTP server the user has logged into. So if there are 100 concurrent users each logged into 5 FTP accounts, the web server will have 100 WebSocket connections + 500 FTP connections. Is servicing 600 connections a lot? I know it depends on the hardware resources of the server, but is something like this doable on a budget? Are there more efficient means of doing something like this? I know it's unlikely that this project will really get popular, but I want it to scale well regardless. Thanks for any help, I've still got a lot to learn.

    Read the article

  • Lync Server 2010 with Hosted VoIP PBX

    - by kmehta
    We just deployed Lync Server 2010 in our organization and it is working great so far. The next step for us is to enable enterprise voice so that we can replace our telephones with service that is handled 100% by Lync. This is where I am at a loss. I have a fully deployed Standard Edition Lync server and a hosted VoIP PBX provider with VoIP handsets. I would like to get rid of the handsets and have my company's phone service be handled by Lync client (e.g. someone calls my work number, and Lync rings instead of my old handset that is set up with the PBX) I am new to deploying these types of features. Any help is appreciated. Thanks.

    Read the article

  • ODBC in SSIS 2012

    - by jamiet
    In August 2011 the SQL Server client team published a blog post entitled Microsoft is Aligning with ODBC for Native Relational Data Access in which they basically said "OLE DB is the past, ODBC is the future. Deal with it.". From that blog post:
    "We encourage you to adopt ODBC in the development of your new and future versions of your application. You don't need to change your existing applications using OLE DB, as they will continue to be supported on Denali throughout its lifecycle. While this gives you a large window of opportunity for changing your applications before the deprecation goes into effect, you may want to consider migrating those applications to ODBC as a part of your future roadmap."
    I recently undertook a project using SSIS 2012 and heeded that advice by opting to use ODBC Connection Managers rather than OLE DB Connection Managers. Unfortunately my finding was that the ODBC Connection Manager is not yet ready for primetime use in SSIS 2012. The main issue I found was that you can't populate an Object variable with a recordset when using an Execute SQL Task connecting to an ODBC data source; any attempt to do so will result in an error: "Disconnected recordsets are not available from ODBC connections." I have filed a bug on Connect at ODBC Connection Manager does not have same funcitonality as OLE DB. For this reason I strongly recommend that you don't make the move to ODBC Connection Managers in SSIS just yet - best to wait for the next version of SSIS before doing that.
    I found another couple of issues with the ODBC Connection Manager that are worth keeping in mind:
    It doesn't recognise System Data Source Names (DSNs), only User DSNs (bug filed at ODBC System DSNs are not available in the ODBC Connection Manager). UPDATE: According to a comment on that Connect item this may only be a problem on 64bit.
    In the OLE DB Connection Manager parameter ordinals are 0-based, in the ODBC Connection Manager they are 1-based (oh I just can't wait for the upgrade mess that ensues from this one!!!)
    You have been warned!
    @jamiet

    Read the article

  • Connect through SSH and type in password automatically, without using a public key

    - by binary255
    A server allows SSH connections, but not using public key authentication. It's not within my power to change this at the moment (due to technical difficulties, not organizational ones), but I will get on it as soon as possible! What I need now is to execute commands on the server from a script using plain old account+password authentication. That is, I need to do it in a non-interactive way. Is it possible? And how do I do it? The client which will be executing the script runs Ubuntu Server 8.04. The server runs Cygwin and OpenSSH.
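    For the record, a non-interactive sketch using sshpass (assuming the package is available for Ubuntu 8.04; note that a password given on the command line is visible in ps output, so the file-based form is preferable):

        # password read from a root-only file
        sshpass -f /root/.server_pw ssh -o StrictHostKeyChecking=no user@server 'df -h'

        # or, less safely, inline
        sshpass -p 'secret' ssh user@server 'uname -a'

    An expect script is the other classic way to drive the password prompt if sshpass cannot be installed.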

    Read the article

  • How can I avoid Heroku stopping my dyno?

    - by iwein
    I build MVPs for clients regularly. Often I deploy on Heroku so they can see if the product works and demo it to prospects and investors. So I have an application deployed on Heroku, and it works like a charm, if not for one little thing. The app takes about 30 seconds to start up, and Heroku has the annoying habit of killing dynos if they don't get traffic. My client is using the application for demo purposes now, so the load is extremely low and intermittent. I'm looking for a solution that is preferably: cost effective, and can be applied to multiple apps simultaneously. What is the best way to avoid having the first request take 30 seconds?
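    One common low-tech workaround is simply to keep the dyno warm by pinging the app from any always-on machine (a sketch; the URL is a placeholder, and constant pinging eats into free dyno hours, so it may work against the "cost effective" requirement):

        # crontab entry: request the app every 20 minutes so the dyno is never idled
        */20 * * * * curl -fsS -o /dev/null https://yourapp.herokuapp.com/

    Covering several apps is then just one extra crontab line per app.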

    Read the article

  • Help on using mod_rewrite to serve I18N static site

    - by Guandalino
    My static site www.example.com is translated into different languages, with files organized in this hierarchy:
        /
          /de
            index.html
            seite-1.html
          /en
            index.html
            page-1.html
          /it
            index.html
            pagina-1.html
    The root contains no files, just one subdirectory for each language the site is translated into, while the subdirectories contain pages translated (both content and file names) into the language corresponding to the subdirectory name: de, en, it, etc. The question is: how do I configure mod_rewrite so that when a client visits www.example.com it is taken to the correct version of the site, falling back to the English version if the required locale is not supported (i.e. the Accept-Language header doesn't exist or specifies a language for which the site is not available, e.g. fr)? Thanks for any pointer, I'm here to provide further details or feedback! Best regards
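    A minimal mod_rewrite sketch of the redirect-by-Accept-Language idea, assuming the rules live in an .htaccess file in the site root and AllowOverride permits FileInfo; only the bare root URL is rewritten, and anything that is not German or Italian falls through to English:

        # .htaccess in the site root
        RewriteEngine On
        # German browsers
        RewriteCond %{HTTP:Accept-Language} ^de [NC]
        RewriteRule ^$ /de/ [L,R=302]
        # Italian browsers
        RewriteCond %{HTTP:Accept-Language} ^it [NC]
        RewriteRule ^$ /it/ [L,R=302]
        # everything else (missing header, unsupported language such as fr) gets English
        RewriteRule ^$ /en/ [L,R=302]

    Apache's mod_negotiation (MultiViews with language variants) is the other standard way to do this, but explicit rewrite rules are easier to reason about for a three-language static site.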

    Read the article

  • MongoDB REST interface not listening after update

    - by Ones and Zeroes
    I replaced the mongodb-10gen install with the Ubuntu package (mongodb-server, mongodb-client and dev):
        apt-get install mongodb
    I am now unable to connect to the REST interface, where it worked before. Doing a wget to http://127.0.0.1:27018, I receive the following response: Connecting to 127.0.0.1:27018... failed: Connection refused.
    My previous /etc/mongodb.conf file had the following in it:
        # enable REST
        rest = true
    Adding it to the packaged conf file does not resolve the issue, not even after restarting. I also tried changing the following, with no effect:
        # Disable the HTTP interface (Defaults to localhost:27018).
        # nohttpinterface = true
    to
        # Disable the HTTP interface (Defaults to localhost:27018).
        nohttpinterface = false
    I have searched for days, and there doesn't seem to be anything on the Mongo site about a similar anomaly. If you have encountered a similar issue on Ubuntu Oneiric, please add your comments, even if you haven't found a solution to this issue.
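    A couple of quick checks that may help (a sketch; note that the HTTP/REST interface normally listens on the mongod port plus 1000, i.e. 28017 for a stock 27017 install, despite what the packaged comment about 27018 suggests):

        # confirm what the packaged config actually contains
        grep -E 'rest|nohttpinterface|port' /etc/mongodb.conf

        sudo service mongodb restart

        # see whether mongod opened an HTTP port at all
        sudo netstat -ltnp | grep mongod

    If 28017 shows up in the listener list, point wget at http://127.0.0.1:28017/ instead of 27018.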

    Read the article

  • Should one reject over-scoped projects?

    - by Little Child
    I spoke to my first potential client today and he told me about the requirements of his project: an Android app. He is a well-known designer/photographer in my country and now wants me to "convert the website into an app, custom-tailored". So the requirements, details stripped out, are as follows:
        eCommerce
        Aggregating all his content like videos, blogs, tweets, etc. into the app
        Live streaming any of his studio demos
        Augmented reality, so that people can see what his painting will look like on their wall before they buy it
        Taxi sharing
    Now, for a freelance project, it seems too over-scoped. I am not saying that I cannot do it. I can. But let me be realistic: There is a steep learning curve when it comes to augmented reality. I am not a tester; I have never white-box tested my own apps, I always black-box test. Since he is a renowned artist, anything short of perfect might harm his public image. So, I asked him for 2 weeks' time before I give him the final answer. Not knowing whom to consult for advice, I am posting the question here. Although interesting and personally challenging, I am split-minded about accepting a project like this. I will be the only developer for this. Should one reject a project that seems to be over-scoped for one's own abilities?

    Read the article

  • Remote X-windows between new RHEL5 and old Solaris 8

    - by joshxdr
    I have a very small lab network with three boxes: a modern x86-based RHEL3 box, an x86-based RHEL5 box, and a 1998-vintage SPARC Ultra5 with Solaris 8. I can use ssh -X to run a program on the RHEL5 box and view the windows on the RHEL3 box. I believe this uses xauth and magic cookies?? I have followed the X-Windows HOWTO to set up xauth on the Solaris box, but so far no dice. I would like to be able to use the X-windows server on the RHEL3 box with a client program on the Solaris box (program running on Solaris host, windows appearing at Linux host). Is there a trick to this, or have I made a mistake following the instructions for setting up xauth and magic cookie?
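    A sketch of the cookie-plus-DISPLAY recipe for this direction (program on the Ultra5, windows on the RHEL3 box). Host names, user names and the cookie value are placeholders, and the RHEL3 X server must be listening on TCP (i.e. not started with -nolisten tcp) for any of this to work:

        # on the RHEL3 box: read the magic cookie for the local display
        xauth list :0
        #   rhel3.example/unix:0  MIT-MAGIC-COOKIE-1  1f2e3d4c...

        # on the Solaris 8 box: register the same cookie for the TCP display name
        /usr/openwin/bin/xauth add rhel3.example:0 MIT-MAGIC-COOKIE-1 1f2e3d4c...
        DISPLAY=rhel3.example:0.0; export DISPLAY
        /usr/openwin/bin/xclock &

    If xclock appears on the RHEL3 screen, the same DISPLAY setting will work for the real program.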

    Read the article

  • 64kb limit on the size of MSMQ Multicast Messages

    - by John Breakwell
    When Windows 2003 came out, Microsoft introduced the ability to broadcast messages to any machines that were listening back. All you had to do was send out a message on a particular port and IP address and any client that had set up a Multicast queue with matching port and IP address would get a copy. Since its introduction, there have been a couple of security vulnerabilities that needed to be removed:
        Microsoft Security Bulletin MS06-052 - Vulnerability in Pragmatic General Multicast (PGM) Could Allow Remote Code Execution (919007)
        Microsoft Security Bulletin MS08-036 - Vulnerabilities in Pragmatic General Multicast (PGM) could allow denial of service (950762)
    The second of these, MS08-036, was resolved through an undocumented change in functionality. Basically, a limit of 64kb was put on the maximum size of a message that could be broadcast using the Multicast method. Obviously this has caused a few problems for any existing MSMQ Multicast applications that expected to be able to send larger messages. A hotfix has been developed to resolve this problem: 961605 FIX: Multicast messages larger than 64 kilobytes (KB) are not delivered as expected by using Message Queuing 3.0 after security update MS08-036 is installed. A registry change is required:
        Open the registry with Regedit
        Navigate to HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\RMCAST\Parameters\
        Create a DWord called MaxpacketSize
        Set the value to the desired number of bytes. You can set it to a value between zero and 4MB. If you specify anything above 4MB, it will default to 64K.
    A reboot is needed after adding this value.

    Read the article

  • How to do a 3-tier using PHP [closed]

    - by Ric
    I have a requirement from a client for my PHP web application to be 3-tier. For example, I would have a web server running Apache in the DMZ, but it should NOT contain any DB connections. It should connect to a middle server that hosts the business objects but sits behind the firewall. Those objects then connect to my SQL cluster on another server. I have actually done this using .NET, but I am not sure how to set up my stack using PHP. I suppose I could have my UI front tier call the middle tier using REST-based web services if I create my middle tier as a second web server, but this seems overly complex. The main reason for this is advanced security: we cannot have any passwords on the DMZ first-tier web server. The second reason is scalability: to have multiple servers on different tiers that can handle the requests. The last reason is deployment: it is easier if I can take one set of servers offline for testing before putting them back in production. Is there an open source project that shows how to do this? The only example I can find is the web server hosting files from a shared drive on another machine (kind of how DotNetNuke pretends to be 3-tier), but that is NOT secure.

    Read the article

  • Will NTP work in an isolated network (in the absence of a reliable time source)?

    - by Anand
    Hi, I am investigating a typical NTP problem. The setup is as follows: FreeBSD is being compiled and run on OpenSolaris. The config file on OpenSolaris lists a Linux machine and another OpenSolaris machine as servers, and these server machines are syncing time with themselves (local clock) only. The server machines in this case have NTP running on them as well. Within a few minutes of starting the ntp daemon, the client starts syncing time with itself only and remains in that situation after that. All servers are discarded and no time syncing is done with them. My question is: is there any fundamental problem with this setup? Will NTP work in such an isolated network that has no direct or indirect connection to a reliable internet time source?
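    NTP can work on an isolated LAN, but one machine has to be configured to serve its own local clock at a believable stratum so that the others will accept it. A minimal sketch for the box chosen as the in-house master (stratum 10 is the conventional fudge value, deliberately poor so a real source wins if one ever appears):

        # /etc/ntp.conf on the designated master
        server 127.127.1.0               # the undisciplined local clock driver
        fudge  127.127.1.0 stratum 10

    The other machines then list only that master in their ntp.conf. If every box serves only its own local clock, each one will happily sync with itself and discard its peers, which matches the behaviour described above.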

    Read the article

  • Suggestion for setting web application parameters

    - by user40730
    I'm creating a web application in GWT, using the MVP pattern with activities and places. I have an XML config file containing some parameters to be used by the application. The content of this XML file is sent to the client using an HttpRequest; I'm using a singleton class to hold the information from the XML file. Right now, the application gets the data when the user starts the application on the home page, and that is working well. Now, since I'm using activities and places, a user can bookmark a page and start the application at any other page (Place). And here comes the problem: since I'm using some of the information from the XML file to set some UI widgets, I have to check whether the XML config file was read and the application already has the parameters (I do this by checking the singleton class). But the XML file is read using an HttpRequest, so I get errors because the application needs some parameters to initialize some UI widgets, but these parameters aren't ready in time. I was thinking of using a synchronous request to fix the problem, but that seems complicated and not recommended. So, I'd like to hear some other suggestions. Thanks.

    Read the article

  • ssh: which side is running the SOCKS proxy?

    - by Barry Brown
    When I set up a tunnel using dynamic forwarding (ssh -D), which side is running the SOCKS proxy? That is, is the proxy running on the local end (client) or the remote end (server)? Here's the situation: I want to set up several tunnels chained together using -L. Should the -D tunnel be the last one in the chain or the first one? Edit: I found the answer to the second paragraph on Super User (the -D tunnel should be at the remotest end). But I'd still like to know where the proxy code is running.
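    To make the first point concrete, a quick sketch (port and host are arbitrary): the -D listener is opened by the ssh client on the local machine, and anything sent to it leaves the tunnel at the remote end.

        # run on the local machine (the client)
        ssh -D 1080 user@gateway.example

        # any locally configured program can now use the proxy, for example
        curl --socks5-hostname localhost:1080 http://ifconfig.me/
        # (recent curl; --socks5 also works but resolves DNS locally)
        # the reply shows gateway.example's public address, confirming traffic exits at the remote end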

    Read the article

  • Best Method/Library For Remote Authentication

    - by Mike
    I have a web app that has a REST API interface at http://api.example.com/core that uses API keys and domain-specific keys (a key has to be used on the specified domain). I will then have several client sites with ajax forms where we will require users to sign in before being able to submit the form. This form will add data to a table and submit an email to several recipients, along with checking credentials. This form will use an ajax submit to our REST API. All communication to/from the API is over SSL. Ideal flow: Visitor fills out form -> enters user/pass -> submits form -> ajax request to REST API -> API verifies credentials -> does CRUD -> sends emails -> returns 200/403 -> perform DOM manipulation based on return code in ajax call. Are there any libraries in PHP that currently do something similar to this? Would OAuth be a good fit for this scenario? Languages used are: JS/HTML/CSS/PHP/MySQL

    Read the article

  • Home PBX to answer/take external calls via PSTN

    - by ageis23
    I have a Thomson 585V6 router which has built-in VoIP support. I want to be able to use a softphone to make calls, for example to phone my dad's mobile. Any incoming calls to my normal BT number should be taken via my PC as well. What I have done so far: I've wired the PSTN port on the router to the telephone jack. The router is connected to my PC. I have installed Asterisk on the PC I want to take calls on. The SIP client authenticates to the SIP server; output from Twinkle: Sun 22:46:45 home, registration succeeded (expires = 3600 seconds). How do I take external calls / answer incoming calls from the PSTN?

    Read the article

  • email archive for multiple users

    - by evanmcd
    Hi, I'm moving a web site from one server to another, and am realizing that I need to move the name servers for the domain as well (they are set to the current host, not to the registrar). So, knowing that email services will stop as soon as I switch the DNS, I'm scrambling to figure out how to archive and make available email data for folks that have mostly been using webmail for the past few years, and may not even have a computer on which to install a client to download the mail to. What does one do in this situation? Thanks for any help offered! Evan
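    If the old host's mailboxes are reachable over IMAP, one way to capture everything before the DNS change is an IMAP-to-IMAP copy per mailbox, for example with imapsync (a sketch; host names, users and passwords are placeholders):

        imapsync --host1 mail.oldhost.example --user1 alice --password1 'oldpass' \
                 --host2 mail.newhost.example --user2 alice --password2 'newpass'

    The same tool (or any IMAP client pointed at the old server by IP rather than by name) can also pull each account down to a local mailbox purely as an archive, so nothing is lost even for users who never install a desktop client.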

    Read the article

  • How to configure a new subdomain for a wildcard certificate?

    - by Amit
    Hi, We have a wildcard certificate installed in our production environment. One of our clients wants his name to appear in the URL (e.g. companyname.example.com). How should we facilitate this? Do we need to make any entries for this in DNS? If yes, can you please let me know about it? I need to set this up before Friday PST; any help with this is highly appreciated. Thanks.
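    In outline: yes, the new host name needs a DNS record pointing at the existing web tier, and the web server needs to answer for that name; the wildcard certificate (*.example.com) already covers any single-level name such as companyname.example.com. A sketch, with the zone file fragment purely illustrative (bump the zone serial and reload after editing):

        ; fragment for the example.com zone
        companyname    IN    CNAME    www.example.com.

    On the web server side, add companyname.example.com as an extra server_name / ServerAlias (or rely on the existing default virtual host) so requests for the new name reach the right site.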

    Read the article

  • How to set up dhcp3-server to advertise the DNS server the server itself has got from DHCP?

    - by Ivan
    The Ubuntu 10.04 server has eth0 Internet interface configured by means of an ISP's DHCP. At the same time the server has static eth0 LAN interface to which it provides masquerading (NAT) and LAN-internal DHCP service (dhcp3-server). As far as I've understood the manual, I had to hardcode DNS servers to advertise through LAN DHCP with option domain-name-servers in dhcpd.conf. But what if the ISP changes his DNS server IP silently (we use a SOHO-class ISP, so this won't surprise me much)? Can I configure dhcpd to advertise the DNS server the server uses itself, the one gotten by its DHCP client mechanism?
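    One approach that avoids hard-coding the ISP's addresses at all: run a small caching resolver on the NAT box itself and advertise the box's own LAN address to the clients; the resolver follows /etc/resolv.conf, which the ISP-facing DHCP client keeps up to date. A sketch, assuming the LAN address is 192.168.1.1 and the stock Ubuntu 10.04 paths:

        sudo apt-get install dnsmasq       # caching resolver that tracks /etc/resolv.conf

        # /etc/dhcp3/dhcpd.conf: hand out the NAT box's own LAN address as DNS
        option domain-name-servers 192.168.1.1;

        sudo /etc/init.d/dhcp3-server restart

    The alternative is a dhclient exit hook that rewrites dhcpd.conf whenever the ISP lease changes, but the caching-resolver route is simpler and also speeds up repeat lookups for the whole LAN.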

    Read the article
