Search Results

Search found 69987 results on 2800 pages for 'wcf data services'.

  • Mavericks: Safari does not log in to web services

    - by Roberto
    Since I upgraded from Mountain Lion to Mavericks, Safari is no longer able to log me into Facebook. When I go to the login page it suggests the correct credentials, I hit the Login button, the page refreshes, but nothing happens, as if the credentials were empty. Firefox works perfectly; I even logged out and back in to make sure the credentials are the same ones Safari suggests, and they are. Needless to say, for a different user on the same Mavericks install, Safari logs in correctly. The same happens with most web pages that need a login, webmail for instance: I have two accounts on different webmail providers and neither of them works. Of course, using the same mail services over POP3 works fine. Even on this very site I cannot post a thing with Safari; I'm going to switch to Firefox to be able to post this question. Again, Firefox or a different user account is fine. Do you have any ideas or suggestions?

  • Which has a faster data transfer rate: Wi-Fi (tablet or cell phone, not LTE) or microSD (Class 10)?

    - by techaddict
    Which of the two methods transfers data at a faster rate for smartphones and tablets: standard Wi-Fi or microSD cards? I wonder whether it would actually be faster to access data on external storage over Wi-Fi than to keep it on a microSD card in my smartphone or tablet. Currently I have a Class 10 32 GB microSD card in my cell phone. I am looking to get the new Google Nexus tablet, but it does not offer expandable internal storage. I wonder if that's really a detriment, because if Wi-Fi is faster than microSD, then it would hardly matter that you couldn't expand the storage internally. If Wi-Fi really is faster, and people caught on to this, they could save a lot of money on lower-capacity iPads/iPhones/iPods, tablets, and smartphones!
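
    For a rough sense of scale (back-of-envelope figures, not a benchmark): the Class 10 rating only guarantees a minimum sequential write speed of 10 MB/s, i.e. 10 × 8 = 80 Mbit/s, and read speeds on such cards are typically higher. Real-world throughput on 802.11n Wi-Fi commonly lands in the tens of Mbit/s and also depends on whatever is serving the data at the other end, so local flash usually wins on raw sequential speed and always wins on latency and offline availability.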

  • ASP.NET MVC & Windsor.Castle: working with HttpContext-dependent services

    - by Igor Brejc
    I have several dependency-injected services which depend on things like the HTTP context. Right now I'm configuring them as singletons in the Windsor container in the Application_Start handler, which is obviously a problem for such services. What is the best way to handle this? I'm considering making them transient and then releasing them after each HTTP request. But what is the best way/place to inject the HTTP context into them? The controller factory, or somewhere else?
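
    One possible shape for this, sketched below, is to register the HttpContext-dependent components with Windsor's per-web-request lifestyle and resolve HttpContextBase through a factory method, so the context is looked up at request time rather than captured at Application_Start. This is only a minimal sketch against the Windsor 3.x fluent API; IEmailComposer/EmailComposer are hypothetical names, not anything from the question.

    using System.Web;
    using Castle.MicroKernel.Registration;
    using Castle.Windsor;

    public interface IEmailComposer { string Compose(); }

    public class EmailComposer : IEmailComposer
    {
        private readonly HttpContextBase _context;
        public EmailComposer(HttpContextBase context) { _context = context; }
        public string Compose() { return "Hello " + _context.User.Identity.Name; }
    }

    public static class ContainerBootstrapper
    {
        public static IWindsorContainer Build()
        {
            var container = new WindsorContainer();
            container.Register(
                // Resolve the *current* HttpContext lazily, once per request.
                Component.For<HttpContextBase>()
                         .UsingFactoryMethod(() => new HttpContextWrapper(HttpContext.Current))
                         .LifestylePerWebRequest(),
                // The service that needs it, scoped to the request as well.
                Component.For<IEmailComposer>()
                         .ImplementedBy<EmailComposer>()
                         .LifestylePerWebRequest());
            return container;
        }
    }

    Note that Windsor's per-web-request lifestyle also needs its HTTP module (Castle.MicroKernel.Lifestyle.PerWebRequestLifestyleModule) registered in web.config; making the components transient and releasing them from a custom controller factory's ReleaseController override is the other common route.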

  • Google I/O 2010 - Batch data processing with App Engine

    App Engine 201, presented by Mike Aizatsky. In this session, attendees will learn how to write map() functions, how to do simple reduce() operations, how to run these over large datasets, and how App Engine is used to accomplish such parallelism. For all I/O 2010 sessions, please go to code.google.com. From: GoogleDevelopers. Time: 38:45.

  • ASP.NET web site running in IIS and hosting a WCF service fails to get connections on the TCP server

    - by Salil
    I am using a combination of a Silverlight client application and an ASP.NET web site running in IIS and hosting a WCF service. The WCF service uses a library that starts a TCP server and initiates requests to the connected TCP clients when the Silverlight client application makes WCF async requests. When I use this library in a local WPF application, the TCP server is able to receive client connection requests and I can get info from these clients. But when I use the same library from the implementation of the WCF service inside the ASP.NET web site project (plus the Silverlight client), the server strangely does not receive any connection requests; that is, when I create the TcpListener object and issue a Start, nothing happens (and no exception is generated either). My setup uses Ethernet for the Internet and Wi-Fi for the TCP clients. Is the WCF service getting confused because of this? Are there any special WCF settings I should put in for TcpListener.Start to work?
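
    Not the poster's code, but a minimal sketch of the kind of listener startup being described, with hypothetical names. Binding explicitly to IPAddress.Any and tracing the bound endpoint at least confirms whether Start() is actually reached, and on which address, when the library runs inside the IIS worker process.

    using System;
    using System.Diagnostics;
    using System.Net;
    using System.Net.Sockets;

    public class DeviceGateway
    {
        private TcpListener _listener;

        public void StartListening(int port)
        {
            // Bind to all interfaces (Ethernet and Wi-Fi alike) so the choice of
            // NIC is explicit rather than left to a default.
            _listener = new TcpListener(IPAddress.Any, port);
            _listener.Start();
            Trace.WriteLine("TCP server listening on " + _listener.LocalEndpoint);
            _listener.BeginAcceptTcpClient(OnClientAccepted, null);
        }

        private void OnClientAccepted(IAsyncResult result)
        {
            TcpClient client = _listener.EndAcceptTcpClient(result);
            Trace.WriteLine("Client connected: " + client.Client.RemoteEndPoint);
            // ... exchange data with the connected client here ...
            _listener.BeginAcceptTcpClient(OnClientAccepted, null);   // keep accepting
        }
    }

    If the trace output never appears under IIS, the library's startup path is not being hit at all; if it appears but clients still cannot connect, the usual suspects are the worker-process account's permissions, the firewall profile of the Wi-Fi adapter, and which network the clients can actually route to.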

  • SQL Server 2012 available in its final release: AlwaysOn, Big Data, Power View, Microsoft keeps its promises

    SQL Server 2012 available in its final release. AlwaysOn, Big Data, Power View: Microsoft's information management and analysis platform keeps its promises. Update of April 3, 2012. As Microsoft had promised, the final version of SQL Server 2012 has been available since April 1, but was officially announced yesterday. Microsoft's information management and analysis platform was designed to be the reference environment for mission-critical enterprise applications, to offer a more complete business intelligence solution integrating Big Data, and to enable better connectivity with the cloud. ...

  • Let&rsquo;s keep informed with &ldquo;Data Explorer&rdquo;

    - by Luca Zavarella
    At PASS Summit 2011 a new project was announced. It's a Microsoft SQL Azure Lab and its codename is Microsoft "Data Explorer". According to the official blog (http://blogs.msdn.com/b/dataexplorer/), this new tool provides an innovative way to acquire new knowledge from the data that interests you. In a nutshell, Data Explorer allows you to combine data from multiple sources and to publish and share the result. In addition, you can generate data streams in the RESTful open format (Open Data Protocol), which can then be used by other applications. Nonetheless, we can still use Excel or PowerPivot to analyze the results. Sources can be varied: Excel spreadsheets, text files, databases, Windows Azure Marketplace, etc. For those who are not familiar with this resource, I strongly suggest you keep an eye on the data services available in the Marketplace: https://datamarket.azure.com/browse/Data To tell the truth, as I read the above blog post, I was tempted to think of Data Explorer as an "SSIS on Azure" aimed at the Power User. In fact, reading the response from Tim Mallalieu (Group Program Manager of Data Explorer) to a comment on his post, my first impression was confirmed: "…we originally thinking of ourselves as Self-Service ETL. As we talked to more folks and started partnering with other teams we realized that would be an area that we can add value but that there were more opportunities emerging." The typical operations of the ETL phase (processing and organization of data in different formats) can be carried out with Data Explorer Mashup (the original post includes a screenshot of the tool), and the flexibility in manipulating information comes from the Data Explorer Formula Language, an Excel-style, formula-based language. Anyone wishing to know more can check the project page in addition to the aforementioned blog: http://www.microsoft.com/en-us/sqlazurelabs/labs/dataexplorer.aspx In light of this new project, there is no doubt about Microsoft's intention to get closer and closer to the Power User, providing flexible and very easy to use tools for data analysis. The prime example of this is PowerPivot. The question that remains is always the same: having more Power Users in a company will implicitly mean having different data models representing the same reality, and this would inevitably lead to anarchic data management... What do you think about that?
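
    Since the post stresses that Data Explorer can publish its results as Open Data Protocol (OData) feeds for other applications to consume, here is a minimal, hypothetical sketch of reading such a feed over plain HTTP; the feed URL is made up purely for illustration.

    using System;
    using System.Net;

    class ODataFeedReader
    {
        static void Main()
        {
            // Hypothetical feed URL: any OData service exposes its entity sets over HTTP.
            var url = "https://example.org/odata/SalesFigures?$top=10&$format=json";
            using (var client = new WebClient())
            {
                client.Headers[HttpRequestHeader.Accept] = "application/json";
                string payload = client.DownloadString(url);
                Console.WriteLine(payload);   // raw JSON; a real consumer would deserialize it
            }
        }
    }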

  • Suggestions for hosted file sharing services

    - by Jon
    Before I pose my question, I will give some insight into my scenario:
      - I work for a small business (cost is an important factor).
      - Our bandwidth is limited and would not support an in-house FTP server.
      - We need to share files (mostly PDF, InDesign, and Illustrator documents) with our clients, and as we expand, we are finding that our current locally hosted FTP solution is too slow and is becoming a detriment to our sales team.
    What we need is a remotely hosted solution to share files with our clients, specifically with the following features:
      - More than 100 GB of secure storage.
      - The ability to distribute unique login credentials to clients, granting access to a personalized directory or folder while limiting access to other files on the server.
      - A relatively simple web-based UI for clients with limited computer knowledge.
    We have considered a dedicated remote server and web-based services (box.net, yousendit.com, onehub.com, filesanywhere.com), but I am unsure about the direction we should be taking. Have I left another solution out? What would you suggest? Thanks in advance.

  • How to download Yahoo historical stock data into .xls format via MATLAB?

    - by Noob_1
    I have an .xls sheet called Tickers (a matrix of 1 column and 500 rows) containing Yahoo tickers. I want MATLAB to download the historical data for the last 5 years for each stock ticker into a separate .xls spreadsheet and save it in a given directory, with the ticker as the title of the sheet. So that means I want code that will create and save 500 tickers' worth of data in 500 separate spreadsheets :) Can anyone help or point me in the right direction?

  • Make services not start automatically after reboot (as they require access to an encrypted partition)

    - by Binary255
    Hi, I use Ubuntu Server 10.04. I more or less only want the server to be accessible over SSH after a reboot. I will then log in and mount the encrypted partition myself, after which I start the services that use it. How would I go about setting something like that up? (My first idea was to have everything except /boot in an encrypted LVM, but I never got logging in through SSH and mounting the LVM to work; initramfs was a bit too complicated for me. Otherwise I think that would have been the best solution.)

  • Windows Load Balancing Services and File Shares

    - by cbkadel
    We are using Windows Load Balancing Services (WLBS). One of the things I notice is that if I create a file share on one of the physical hosts, I am able to browse to that share using the clustered IP address. This might be an 'opinion' question, but I haven't been able to find much literature on file shares in particular with WLBS. Is this a recommended configuration? Are there any limitations? What about when the share contains different sets of content on the two hosts? For instance: three 'hostnames': host1 (physical1), host2 (physical2), and cluster. I create the following shares: \\physical1\myshare and \\physical2\myshare. What I notice is that I can see \\cluster\myshare. I'm guessing that this is read-only and that there's no file synchronization. But what happens if they are in fact out of sync? What would a network browser see then? Thanks for your time!

  • Is there a way to visualize records stored in an iPhone app via Core Data?

    - by Justin Searls
    I have an app which, for good reasons, can only be debugged on a device. I'm using Core Data for the first time, and I'd like to be able to easily inspect the records that are stored by the app on the device. I imagine that Core Data is by default backed by SQLite on the iPhone, so this question might be as simple as asking: "What's the easiest way to extract the SQLite database for an app installed by Xcode without jailbreaking it?" Any experience someone could lend regarding this would be greatly appreciated.

  • Chapter 3: Data-Tier Applications

    With the release of Microsoft SQL Server 2008 R2, the SQL Server Manageability team addressed these struggles by introducing support for data-tier applications to help streamline the deployment, management, and upgrade of database applications. A data-tier application, also referred to as a DAC, is a single unit of deployment that contains all the elements used by an application, such as the database application schema, instance-level objects, associated database objects, files and scripts, and even a manifest defining the organization's deployment requirements.
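
    For a concrete flavor of what a single unit of deployment means in practice, the sketch below deploys a previously built .dacpac programmatically. It uses the later DacFx client library (Microsoft.SqlServer.Dac) rather than the original SQL Server 2008 R2 object model, and the connection string, package path, and database name are illustrative only.

    using Microsoft.SqlServer.Dac;

    class DacDeployer
    {
        static void Main()
        {
            // Illustrative connection string and package path.
            var services = new DacServices("Data Source=.;Integrated Security=true");
            using (DacPackage package = DacPackage.Load(@"C:\builds\Sales.dacpac"))
            {
                // Creates the Sales database if it is missing, upgrades it in place otherwise.
                services.Deploy(package, "Sales", upgradeExisting: true);
            }
        }
    }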

  • How should flushing be handled in a Doctrine EntityManager instance shared across different services in Symfony2?

    - by Jbm
    I have defined several services in Symfony2 which persist changes to the database. These services have the Doctrine instance as one of their dependencies:

    a.given.service:
        class: Acme\TestBundle\Service\AGivenService
        arguments: [@doctrine]

    If I have two different services and both of them persist objects through the EntityManager, which is obtained like this from the Doctrine instance:

    $em = $doctrine->getEntityManager();

    would all services always share the same EntityManager? If so, how should I handle flushing if I wanted to handle all the changes in a single transaction? I have checked http://docs.doctrine-project.org/projects/doctrine-orm/en/2.0.x/reference/transactions-and-concurrency.html and it explains how to handle different transactions in a request, but I want to achieve the opposite, which is having different changes in different services handled as a single transaction. Is there a better approach to handling multiple changes in different services? For now my best bet is having a front-end service in charge of calling the other services and doing the flushing afterwards; the backend services would persist objects but would not do any flushing.

  • Web map services displaying poorly

    - by user29261
    Web map services no longer display correctly in ArcGIS and Google Earth; no one else on the network is experiencing these problems. I am using Windows 7. The problems began abruptly (one day they were working, the next they were not). Specific problems include: not displaying at all; correct display on 10% of the monitor and repeating lines of coloured squiggles and text on the remaining 90%; and small, widely spaced pixels in place of colour fill. Links where this occurs: ArcGIS http://wms.ess-ws.nrcan.gc.ca/wms/toporama_en and Google Earth http://openmaps.gov.bc.ca/kml/BCGov_Web_Map_Library.kml I can't pinpoint any changes to the computer setup which may have prompted this; however, installing ArcGIS 10.1 SP1 occurred around the same time. Probably just a coincidence. Has anyone had similar problems, or solutions to these? Thanks for any thoughts. Jim

  • Google Analytics statistics

    - by colmcq
    I am compiling a report for a client using Google Analytics. I have observed that the client has unusually good page view times (5 minutes) and excellent bounce rates (<25%). I need to reference research data that validates my assertion that these figures are excellent compared to an industry standard (the industry is e-commerce and gaming). Can you direct me to any published research that gives normal bounce rates and page view times for this industry?

  • PHP: passing data between pages without using the URL?

    - by terabytest
    I have a PHP page with a form that asks for an e-mail address. When you press the Send button, it goes to another PHP page, which gets the form data and does its stuff. I then need to be able to go back to the old page (the one that contained the form) and give it some data so that it can change itself, say "You've sent your e-mail successfully", and not display the form. How do I do it?

  • Resolving domain names differently for different services

    - by mlaug
    Some time ago we had an issue with our network infrastructure and PHP with cURL. Our network infrastructure is fairly simple: a load balancer/firewall in front of 5 servers. The domain name of our website points to the IP of the load balancer, of course. But calling cURL from one of the servers resulted in a timeout; it appears that a server could not call the very domain it is serving. So we had to point the domain at the server itself via /etc/hosts. But now we have implemented Varnish in front of the load balancer, and we want to purge it automatically once a page changes. So now we need to call the domain, www.example.com/url_to_purge. Sadly this call would be resolved to the server itself instead of Varnish, because of the /etc/hosts entries. So now I am wondering if you can resolve domain names differently for different services :)

  • Encryption Primer for SQL Server Data

    As a database developer or DBA, there is not a lot you can do about a legitimate user sharing confidential data. However, you can minimize the risk of someone breaking into your database and browsing around to find confidential information. This article explores how you can use SQL Server features to encrypt your confidential data.
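
    To give a flavor of the kind of feature the article covers, the sketch below calls SQL Server's built-in ENCRYPTBYPASSPHRASE and DECRYPTBYPASSPHRASE functions from ADO.NET. The table, column (a varbinary column), passphrase, and connection string are made up purely for illustration.

    using System;
    using System.Data.SqlClient;

    class EncryptedColumnDemo
    {
        static void Main()
        {
            using (var conn = new SqlConnection("Data Source=.;Initial Catalog=Sales;Integrated Security=true"))
            {
                conn.Open();

                // Store a card number encrypted with a passphrase (CardNumber is a varbinary column).
                using (var update = new SqlCommand(
                    "UPDATE dbo.Customers SET CardNumber = ENCRYPTBYPASSPHRASE(@phrase, @card) WHERE Id = @id", conn))
                {
                    update.Parameters.AddWithValue("@phrase", "correct horse battery staple");
                    update.Parameters.AddWithValue("@card", "4111111111111111");
                    update.Parameters.AddWithValue("@id", 42);
                    update.ExecuteNonQuery();
                }

                // Read it back; DECRYPTBYPASSPHRASE returns varbinary, so convert to the original type.
                using (var select = new SqlCommand(
                    "SELECT CONVERT(nvarchar(32), DECRYPTBYPASSPHRASE(@phrase, CardNumber)) FROM dbo.Customers WHERE Id = @id", conn))
                {
                    select.Parameters.AddWithValue("@phrase", "correct horse battery staple");
                    select.Parameters.AddWithValue("@id", 42);
                    Console.WriteLine(select.ExecuteScalar());
                }
            }
        }
    }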

  • How to display HTML-like table data on iPhone?

    - by Jason
    I have a set of data in a matrix which I would like to display in my iPhone app with all of the rows and columns intact. Everything I can find on the web dealing with "tables iPhone" gives me information on UITableView, which only lets you show a list of items to the user - not an actual table in the HTML sense. What's the best way on the iPhone to display an actual table of data to the user, with column & row headings and table cells?

  • Regaining access to Linux server after SSH service dies?

    - by GigaWatt
    I recently ran into an issue with a VPS where the SSH service crashed, leaving me unable to connect to the machine. The other services were up and running; only the SSH service died. I managed to resolve the situation with a reboot from the VPS control panel, but the incident got me thinking. Assuming:
      - I don't have physical access to the machine
      - I have no server control panel access or means of rebooting the server
      - All other system services are still functioning
    then how could I recover from the SSH service dying?
