Search Results

Search found 15877 results on 636 pages for 'reporting services'.

Page 19/636 | < Previous Page | 15 16 17 18 19 20 21 22 23 24 25 26  | Next Page >

  • Methods of providing custom reports for an asp.net application

    - by Jarrod
    In our Windows application, we used Crystal Reports. If a customer needed a custom report, we could create it and then send them the .rpt file. The customer would then simply add the report file to a custom folder and could access it directly from our application. How is this possible in a web app? It seems like anything created using SSRS, Crystal, or even Telerik Reporting must be embedded. What are some methods for providing custom reports to users of an ASP.NET web application?
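
    One possible approach, analogous to the Crystal "drop the .rpt in a folder" model, is to let customers drop RDLC files into a known folder and point the ASP.NET ReportViewer control at them at runtime. Below is a rough sketch, assuming a Web Forms page with a ReportViewer named rvReport declared in the markup; the folder name, query string parameter and GetReportData helper are hypothetical.

        using System;
        using System.Data;
        using System.IO;
        using Microsoft.Reporting.WebForms;

        public partial class ReportPage : System.Web.UI.Page
        {
            protected void Page_Load(object sender, EventArgs e)
            {
                // Customer-supplied .rdlc files live in ~/CustomReports.
                // In real code, validate the report name to prevent path traversal.
                string reportFile = Request.QueryString["report"] + ".rdlc";
                string reportPath = Path.Combine(Server.MapPath("~/CustomReports"), reportFile);

                rvReport.ProcessingMode = ProcessingMode.Local;
                rvReport.LocalReport.ReportPath = reportPath;

                rvReport.LocalReport.DataSources.Clear();
                rvReport.LocalReport.DataSources.Add(new ReportDataSource("MainDataSet", GetReportData()));
                rvReport.LocalReport.Refresh();
            }

            // Hypothetical helper: load the data for the report's dataset from your database.
            private DataTable GetReportData()
            {
                return new DataTable("MainDataSet");
            }
        }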

    Read the article

  • I need a backend system that is integrated with web services, is there an open source solution?

    - by Jarom
    I'm basically familiar with what I need in order to set up web services to talk to a centralized DB, but if I don't have to do all the work myself, I'd rather not. Is there an open source solution that would let me easily integrate web services for data transfer to a central DB? I want to make a site that is powered by a DB that can also be accessed by other things, like mobile apps for example. What are the steps involved in setting up such a site? Any help is appreciated; I could use all the help I can get!

    Read the article

  • How to deploy SQL Server 2005 Reporting Services on a network without a domain server?

    - by ti
    I have a small Windows network (~30 machines) and I need to deploy SQL Server 2005 Reporting Services. Because I use SQL Server Standard Edition and not Enterprise, I am forced to use Windows Authentication for the users. I am a Linux admin and have near-zero knowledge of Active Directory. As far as my shallow knowledge goes, I think I would need to invest in a domain server and a mirrored backup of that domain server. I think I would also need to change every computer to use this domain, and if the domain server goes down, every computer will be unavailable. Is there an easier way to deploy Windows Authentication so that users can access Reporting Services from their computers without changing the infrastructure that much? Thanks!

    Read the article

  • Is it safe to remove Per user queued Windows Error Reporting?

    - by Rewinder
    I was cleaning up my laptop hard disk, running Windows 7, and as part of the process I ran the Disk Cleanup utility. To my surprise I saw two items in the list that were quite large (both ~300MB):
    • Per user queued Windows Error Reporting
    • System queued Windows Error Reporting
    I guess I had never noticed these, because they were never that big. So, what are these items? Is there any particular reason why they became so large all of a sudden? And finally, is it safe to remove them?

    Read the article

  • WCF - Automatically create ServiceHost for multiple services

    - by Rajesh Pillai
    Welcome back, readers! This blog post is about a small tip that may make working with the WCF ServiceHost a bit easier if you have lots of services and need to host them quickly for testing. Recently I encountered a situation where we had to create multiple service hosts quickly for testing. Here is the code snippet, which is pretty self-explanatory. You can put this code in your service host, which in this case is a console application.

        using System;
        using System.Collections.Generic;
        using System.Configuration;
        using System.ServiceModel;
        using System.ServiceModel.Configuration;

        class Program
        {
            static void Main(string[] args)
            {
                // Stores all hosts
                List<ServiceHost> hosts = new List<ServiceHost>();
                try
                {
                    // Get the services element from the serviceModel element in the config file
                    var section = ConfigurationManager.GetSection("system.serviceModel/services") as ServicesSection;
                    if (section != null)
                    {
                        foreach (ServiceElement element in section.Services)
                        {
                            // NOTE: If the service type lives in another assembly, provide an
                            // assembly-qualified name here in the form Namespace.TypeName, AssemblyName
                            // e.g. Business.Services.CustomerService, Business.Services
                            var serviceType = Type.GetType(element.Name); // Get the type from its configured name
                            var host = new ServiceHost(serviceType);
                            hosts.Add(host); // Add to the host collection
                            host.Open();     // Open the host
                        }
                    }
                    Console.ReadLine();
                }
                catch (Exception e)
                {
                    Console.WriteLine(e.Message);
                    Console.ReadLine();
                }
                finally
                {
                    foreach (ServiceHost host in hosts)
                    {
                        if (host.State == CommunicationState.Opened)
                        {
                            host.Close();
                        }
                        else
                        {
                            host.Abort();
                        }
                    }
                }
            }
        }

    I hope you find this useful. You can make this a Windows service if required.
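
    As the post notes, the same config-driven loop can be hosted in a Windows service instead of a console application. Here is a minimal sketch of that idea, assuming a reference to System.ServiceProcess; the WcfHostService name is illustrative and error handling is omitted for brevity.

        using System;
        using System.Collections.Generic;
        using System.Configuration;
        using System.ServiceModel;
        using System.ServiceModel.Configuration;
        using System.ServiceProcess;

        public class WcfHostService : ServiceBase
        {
            private readonly List<ServiceHost> _hosts = new List<ServiceHost>();

            protected override void OnStart(string[] args)
            {
                // Same config-driven loop as in the console host, abbreviated.
                var section = (ServicesSection)ConfigurationManager.GetSection("system.serviceModel/services");
                foreach (ServiceElement element in section.Services)
                {
                    var host = new ServiceHost(Type.GetType(element.Name));
                    _hosts.Add(host);
                    host.Open();
                }
            }

            protected override void OnStop()
            {
                foreach (var host in _hosts)
                {
                    if (host.State == CommunicationState.Opened) host.Close();
                    else host.Abort();
                }
            }

            public static void Main()
            {
                ServiceBase.Run(new WcfHostService());
            }
        }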

    Read the article

  • Ways of breaking down SQL transactional/call data into reports -- 'square data'?

    - by RizwanK
    I've got a large database of call-traffic information (although the question could be answered with any generic data set). For instance, a row contains:
    • call endpoint server (endpoint_name)
    • call endpoint status (sip_disconnect_reason)
    • call destination (destination)
    • call completed (duration) [duration <> 0 is completed]
    • call account group (account_group)
    It's pretty easy to run SQL reports against the data, e.g.

        select count(*), endpoint_name from calls where duration <> 0 group by endpoint_name
        select count(*), destination from calls where blah group by destination

    I've been calling these filtering or breakdown reports (I get the number of calls per carrier, etc.). Add another breakdown, and you've got two breakdowns, a la

        select count(*), endpoint_name, sip_disconnect_reason from calls where duration = 0 group by endpoint_name, sip_disconnect_reason

    Of course, if you keep adding breakdowns, you end up making super-large reports and slicing your data so thin that you can't extract any trends from it. So my question is this: is there a name for this sort of method of report writing? (I've heard words like squares, slicing and breakdown reports applied to them.) I'm looking for a Python/reporting toolkit that I can use to make these easier to generate for my end users. Aside: are there other ways of representing transactional data that might be useful rather than the above method? Thanks.

    Read the article

  • How to deploy SQL Reporting 2005 when Data Sources are locked?

    - by spoulson
    The DBAs here maintain all SQL Server and SQL Reporting servers. I have a custom-developed SQL Reporting 2005 project in Visual Studio that runs fine against my local SQL Database and Reporting instances. I need to deploy to a production server, so I had a folder created on a SQL Reporting 2005 server with permissions to upload files. Normally, a deploy from within Visual Studio is all that is needed to upload the report files. However, for security purposes, data sources are maintained exclusively by the DBAs and stored in a separate, locked-down common folder on the reporting server. I had them create the data source for me. When I attempt to deploy from VS, it gives me the error "The item '/Data Sources' already exists." I get this whether I'm deploying the whole project or just a single report file. I already set OverwriteDataSources=false in the project properties. The TargetServer URL and folder are verified correct. I suppose I could copy the files manually, but I'd like to be able to deploy from within VS. What could I be doing wrong?

    Read the article

  • WCF RIA Services DomainContext Abstraction Strategies–Say That 10 Times!

    - by dwahlin
    The DomainContext available with WCF RIA Services provides a lot of functionality that can help track object state and handle making calls from a Silverlight client to a DomainService. One of the questions I get quite often in our Silverlight training classes (and see often in various forums and other areas) is how the DomainContext can be abstracted out of ViewModel classes when using the MVVM pattern in Silverlight applications. It's not something that's super obvious at first, especially if you don't work with delegates a lot, but it can definitely be done. There are various techniques and strategies that can be used, but I thought I'd share some of the core techniques I find useful. To start, let's assume you have the following ViewModel class (this is from my Silverlight Firestarter talk, available to watch online here if you're interested in getting started with WCF RIA Services):

        public class AdminViewModel : ViewModelBase
        {
            BookClubContext _Context = new BookClubContext();

            public AdminViewModel()
            {
                if (!DesignerProperties.IsInDesignTool)
                {
                    LoadBooks();
                }
            }

            private void LoadBooks()
            {
                _Context.Load(_Context.GetBooksQuery(), LoadBooksCallback, null);
            }

            private void LoadBooksCallback(LoadOperation<Book> books)
            {
                Books = new ObservableCollection<Book>(books.Entities);
            }
        }

    Notice that BookClubContext is being used directly in the ViewModel class. There's nothing wrong with that of course, but if other ViewModel objects need to load books then code would be duplicated across classes. Plus, the ViewModel has direct knowledge of how to load data, and I like to make it more loosely coupled. To do this I create what I call a "Service Agent" class. This class is responsible for getting data from the DomainService and returning it to a ViewModel. It only knows how to get and return data but doesn't know how data should be stored and isn't used with data binding operations. An example of a simple ServiceAgent class is shown next. Notice that I'm using the Action<T> delegate to handle callbacks from the ServiceAgent to the ViewModel object. Because LoadBooks accepts an Action<ObservableCollection<Book>>, the callback method in the ViewModel must accept ObservableCollection<Book> as a parameter. The callback is initiated by calling the Invoke method exposed by Action<T>:

        public class ServiceAgent
        {
            BookClubContext _Context = new BookClubContext();

            public void LoadBooks(Action<ObservableCollection<Book>> callback)
            {
                _Context.Load(_Context.GetBooksQuery(), LoadBooksCallback, callback);
            }

            public void LoadBooksCallback(LoadOperation<Book> lo)
            {
                // Check for errors of course...keeping this brief
                var books = new ObservableCollection<Book>(lo.Entities);
                var action = (Action<ObservableCollection<Book>>)lo.UserState;
                action.Invoke(books);
            }
        }

    This can be simplified by taking advantage of lambda expressions. Notice that in the following code I don't have a separate callback method and don't have to worry about passing or casting any user state (the user state is the 3rd parameter in the _Context.Load method call shown above):

        public class ServiceAgent
        {
            BookClubContext _Context = new BookClubContext();

            public void LoadBooks(Action<ObservableCollection<Book>> callback)
            {
                _Context.Load(_Context.GetBooksQuery(), (lo) =>
                {
                    var books = new ObservableCollection<Book>(lo.Entities);
                    callback.Invoke(books);
                }, null);
            }
        }

    A ViewModel class can then call into the ServiceAgent to retrieve books, yet never know anything about the DomainContext object or even know how data is loaded behind the scenes:

        public class AdminViewModel : ViewModelBase
        {
            ServiceAgent _ServiceAgent = new ServiceAgent();

            public AdminViewModel()
            {
                if (!DesignerProperties.IsInDesignTool)
                {
                    LoadBooks();
                }
            }

            private void LoadBooks()
            {
                _ServiceAgent.LoadBooks(LoadBooksCallback);
            }

            private void LoadBooksCallback(ObservableCollection<Book> books)
            {
                Books = books;
            }
        }

    You could also handle the LoadBooksCallback method using a lambda if you wanted to minimize code, just like I did earlier with the LoadBooks method in the ServiceAgent class. If you're into Dependency Injection (DI), you could create an interface for the ServiceAgent type, reference it in the ViewModel and then inject the object to use at runtime (a rough sketch of this follows below). There are certainly other techniques and strategies that can be used, but the code shown here provides an introductory look at the topic that should help get you started abstracting the DomainContext out of your ViewModel classes when using WCF RIA Services in Silverlight applications.
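
    The DI variation mentioned above might look something like the following minimal sketch; the IServiceAgent interface and the constructor parameter are illustrative names, not part of the original article, and the Books property is assumed to exist on the ViewModel as in the earlier snippets.

        using System;
        using System.Collections.ObjectModel;
        using System.ComponentModel;

        public interface IServiceAgent
        {
            void LoadBooks(Action<ObservableCollection<Book>> callback);
        }

        public class AdminViewModel : ViewModelBase
        {
            private readonly IServiceAgent _serviceAgent;

            // The concrete ServiceAgent (or a test double) is supplied at runtime,
            // so the ViewModel never references BookClubContext directly.
            public AdminViewModel(IServiceAgent serviceAgent)
            {
                _serviceAgent = serviceAgent;

                if (!DesignerProperties.IsInDesignTool)
                {
                    _serviceAgent.LoadBooks(books => Books = books);
                }
            }
        }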

    Read the article

  • Uninstalling Reporting Server 2008 on Windows Server 2008

    - by Piotr Rodak
    Ha. I had the quite disputable pleasure of installing and reinstalling and reinstalling and reinstalling – I think about 5 times before it worked – Reporting Server 2008 on Windows Server with the same year number in its name. During my struggle I came across an error which seems to be not quite unfamiliar to some more unfortunate developers and admins who happen to uninstall SSRS 2008 from a server. I had SSRS 2008 installed as a named instance, SQL2008. I wanted to uninstall the server and install it to the default instance. And this is when it bit me – not the first time and not the last that day. The setup complained that it couldn't access a DLL. Error message:

        TITLE: Microsoft SQL Server 2008 Setup
        ------------------------------
        The following error has occurred:
        Access to the path 'C:\Windows\SysWOW64\perf-ReportServer$SQL2008-rsctr.dll' is denied.
        For help, click: http://go.microsoft.com/fwlink?LinkID=20476&ProdName=Microsoft+SQL+Server&EvtSrc=setup.rll&EvtID=50000&ProdVer=10.0.1600.22&EvtType=0x60797DC7%25400x84E8D3C0
        ------------------------------
        BUTTONS: OK

    This is a screenshot that shows the above error. This issue seems to have a bit of literature dedicated to it, and even seemingly a KB article http://support.microsoft.com/kb/956173 and a similar Connect item: http://connect.microsoft.com/SQLServer/feedback/details/363653/error-messages-when-upgrading-from-sql-2008-rc0-to-rtm The article describes the issue as follows:

        When you try to uninstall Microsoft SQL Server 2008 Reporting Services from the server, you may receive the following error message: An error has occurred: Access to the path 'Drive_Letter:\WINDOWS\system32\perf-ReportServer-rsctr.dll' is denied. Note: Drive_Letter refers to the disc drive into which the SQL Server installation media is inserted.

    In my case, the Note was not true; the error pointed to a dll that was located in the Windows folder on C:\, not where the installation media were. Despite this difference I tried to identify any processes that might be keeping a lock on the dll. I downloaded Sysinternals Process Explorer and ran it to find any processes I could stop. Unfortunately, there was no such process. I tried to rerun the installation, but it failed at the same step. Eventually I decided to remove the dll before the setup was executed. I changed the name of the dll so I would be able to restore it in case of issues. Interestingly, Windows let me do it, which means that indeed, it was not locked by any process. I ran the setup and this time it uninstalled the instance without any problems. To summarize my experience I should say – be very careful, don't leave any leftovers after uninstallation – remove/rename any folders that are left after setup has finished. For some reason, setup doesn't remove certain folders and files. Installation on Windows Server 2008 requires more attention than on Windows 2003 because of the changed security model; some actions can be executed only by an administrator in elevated execution mode. In general, you have to get used to UAC and a somewhat different experience than with Windows Server 2003. Technorati Tags: SQL Server 2008, Windows Server 2008, SRS, Reporting Services

    Read the article

  • Using Telerik Reporting in a WPF application

    Now that Telerik Reporting provides WPF support, let's see how to use it (a video is also available on Getting Started with the WPF viewer).
    Creating the application:
    1. Install RadControls for WPF 2010 Q1 SP1 (download | release notes).
    2. Install the corresponding Telerik Reporting version.
    3. Create a new WPF application project in Visual Studio.
    4. Add references to the following Telerik RadControls for WPF assemblies: Telerik.Windows.Controls, Telerik.Windows.Controls.Input, Telerik.Windows.Controls.Navigation, Telerik.Windows.Data. NOTE: It is possible that the RadControls for WPF assemblies have a greater version than the one against which the WPF Report Viewer control was built. In this case you have to add appropriate assembly binding redirects (see Binding Redirects below).
    5. Drag and drop the ReportViewer control from the toolbox onto the WPF window. If the ReportViewer is not available in the toolbox, you can add it using the instructions from the How to add the WPF ...

    Read the article

  • Working with Reporting Services Filters – Part 3: The TOP and BOTTOM Operators

    - by smisner
    Thus far in this series, I have described using the IN operator and the LIKE operator. Today, I'll continue the series by reviewing the TOP and BOTTOM operators. Today, I happened to be working on an example of using the TOP N operator and was not successful on my first try because the behavior is just a bit different than we find when using an "equals" comparison, as I described in my first post in this series. In my example, I wanted to display a list of the top 5 resellers in the United States for AdventureWorks, but I wanted it based on a filter. I started with a hard-coded filter like this:

        Expression: [ResellerSalesAmount]   Data Type: Float   Operator: Top N   Value: 5

    And received the following error:

        A filter value in the filter for tablix 'Tablix1' specifies a data type that is not supported by the 'TopN' operator. Verify that the data type for each filter value is Integer.

    Well, that puzzled me. Did I really have to convert ResellerSalesAmount to an integer to use the Top N operator? Just for kicks, I switched to the Top % operator like this:

        Expression: [ResellerSalesAmount]   Data Type: Float   Operator: Top %   Value: 50

    This time, I got exactly the results I expected – I had a total of 10 records in my dataset results, so 50% of that should yield 5 rows in my tablix. So thinking about the problem with Top N some more, I switched the Value to an expression, like this:

        Expression: [ResellerSalesAmount]   Data Type: Float   Operator: Top N   Value: =5

    And it worked! So the value for Top N or Top % must reflect a number to plug into the calculation, such as Top 5 or Top 50%, and the expression is the basis for determining what's in that group. In other words, Reporting Services will sort the rows by the expression – ResellerSalesAmount in this case – in descending order, and then filter out everything except the topmost rows based on the operator you specify. The curious thing is that, if you're going to hard-code the value, you must enter the value for Top N with an equal sign in front of the integer, but you can omit the equal sign when entering a hard-coded value for Top %. This experience is why working with Reporting Services filters is not always intuitive! When you use a report parameter to set the value, you won't have this problem. Just be sure that the data type of the report parameter is set to Integer. Jessica Moss has an example of using a Top N filter in a tablix which you can view here. Working with Bottom N and Bottom % works similarly. You just provide a number for N or for the percentage and Reporting Services works from the bottom up to determine which rows are kept and which are excluded.

    Read the article

  • The Case of the Missing Date/Time Stamp: Reporting Services 2008 R2 Snapshots

    - by smisner
    This week I stumbled upon an undocumented "feature" in SQL Server 2008 R2 Reporting Services as I was preparing a demonstration on how to set up and use report snapshots. If you're familiar with the main changes in this latest release of Reporting Services, you probably already know that Report Manager got a facelift this time around. Although this facelift was generally a good thing, one of the casualties – in my opinion – is the loss of the snapshot label that served two purposes… First, it flagged the report as a snapshot. Second, it let you know when that snapshot was created. As part of my standard operating procedure when demonstrating report snapshots, I point out this label, so I was rather taken aback when I didn't see it in the demonstration I was preparing. It sort of upset my routine, and I'm rather partial to my routines. I thought perhaps I wasn't looking in the right place and changed Report Manager from Tile View to Detail View, but no – that label was still missing. In the grand scheme of life, it's not an earth-shattering change, but you'll have to look at the Modified Date in Details View to know when the snapshot was run. Or hope that the report developer included a textbox to show the execution time in the report. (Hint: this is a good time to add this to your list of report development best practices, whether a report gets set up as a report snapshot or not!)
    A snapshot from the past
    In case you don't remember how a snapshot appeared in Report Manager back in the old days (of SQL Server 2008 and earlier), here's an image I snagged from my Reporting Services 2008 Step by Step manuscript.
    A snapshot in the present
    A report server running in SharePoint integrated mode had no such label. There you had to rely on the Report Modified date-time stamp to know the snapshot execution time. So I guess all platforms are now consistent. Here's a screenshot of Report Manager in the 2008 R2 version. One of these is a snapshot and the rest execute on demand. Can you tell which is the snapshot?
    Consider descriptions as an alternative
    So my report snapshot demonstration has one less step, and I'll need to edit the Denali version of the Step by Step book. Things are simpler this way, but I sure wish we had an easier way to identify the execution methods of the reports. Consider using the description field to alert users that the report is a snapshot. It might save you a few questions about why the data isn't up-to-date if the users know that something changed in the source of the report. Notice that the full description doesn't display in Tile View, so keep it short and sweet or instruct users to open Details View to see the entire description.

    Read the article

  • Microsoft Dynamics GP 2010 will be released on May 1, with improved interoperability and reporting

    Update from 24.04.2010 by Katleen. Microsoft Dynamics GP 2010 will be released on May 1, with improvements to interoperability and reporting. Microsoft Dynamics GP will (finally) be updated very soon; it is a matter of days. On May 1, the software will receive several improvements. Microsoft Dynamics GP 2010 will include new business intelligence reporting features and will be better integrated thanks to tighter interaction with Microsoft Dynamics CRM and Office Unified Communications, as part of an effort to improve interoperability with other Microsoft products. After three years of waiting, professionals' patience will finally be rewarded.

    Read the article

  • Lack of Transparency in the Supply Chain Results in Inconsistent Reporting on Conflict Minerals

    - by Terri Hiskey
    May 31, 2014 was the official deadline for U.S.-listed companies to disclose use of conflict minerals to the SEC. Of the estimated 6,000 companies that were required to file audits of the tin, gold, tungsten or tantalum in their products, only 1,300 filed reports, and these results have revealed the ongoing challenges that many manufacturers face in complying with this legislation. An article authored by IDC analyst Heather Ashton, "Conflict Minerals Reporting Passes a Notable Milestone", notes that many leading companies such as Intel, Apple and HP filed their reports ahead of the deadline, but other companies are struggling to trace their supply chain back to raw materials, especially as many non-U.S.-based suppliers have no legal requirement to comply with the law since they are not U.S.-listed companies. This has resulted in widely varying levels of reporting from company to company. Check out the full article here. Are your customers experiencing the same pains?

    Read the article

  • Oracle's Shared Services Model can Bring you Tremendous Savings

    Mike Gagas, Senior Director, Oracle On Demand Marketing, explains how companies consolidate and streamline back-office business processes with a shared services model. Numerous companies are achieving operational excellence using Oracle products and services to successfully deploy shared services. Oracle itself has saved over 2 billion dollars with shared services.

    Read the article

  • Make logwatch reports more interesting?

    - by Alexander Shcheblikin
    Is it possible to improve the quality of reports from logwatch? For example, can it report significant changes in usage or warn when capacity approaches critical levels, instead of just reporting disk usage that barely changes in daily operation? If I cannot do that with logwatch and instead have to write custom scripts to produce such reports, logwatch appears to be pretty useless, or even dangerous, as many users reportedly grow to ignore its emails knowing how boring they are.

    Read the article

  • eSeminar ISV Partner Update: High Quality Reporting for Your Applications

    - by Mike.Hallett(at)Oracle-BI&EPM
    Play eSeminar. Duration: 18 minutes.
    Description: This webinar presents to ISV Partners Oracle's latest release of BI Publisher, and describes how this tool can make their applications more competitive and appealing to their customers by providing high-quality reporting and business intelligence embedded into their solution.
    • BI Publisher can provide all reports… at lower cost
    • Easier, with better developer productivity
    • Better managed: better performance, less administration
    • Highest quality: pixel-perfect and interactive reporting
    Play eSeminar (only accessible to Oracle Partners).

    Read the article

  • How to Obtain the Best SEO Services

    The present IT field is full of companies that boast of offering the best SEO services at an affordable rate. Some even claim to provide cost-effective SEO services without compromising on quality. Though a majority of the websites try to negotiate on price, the services are still worth obtaining at the quoted price. Choosing SEO services that provide the best quality is an uphill task. One of the factors that helps is thorough research on the Internet.

    Read the article

  • DomainContext class is not created in a RIA services class library solution

    - by Konamiman
    I have created a RIA Services class library project and things are not going as I expected. The problem is that when I add domain service classes to the server project, the corresponding domain context classes are not generated in the client project. I start by creating a new project of type WCF RIA Services Class Library. The generated solution has two projects: RIAServicesLibrary1 (the Silverlight class library project) and RIAServicesLibrary1.Web (the class library that will hold the services). Then I add a new project item to RIAServicesLibrary1.Web of type Domain Service Class. I add a sample method so that the resulting class code is:

        namespace RIAServicesLibrary1.Web
        {
            using System;
            using System.Collections.Generic;
            using System.ComponentModel;
            using System.ComponentModel.DataAnnotations;
            using System.Linq;
            using System.ServiceModel.DomainServices.Hosting;
            using System.ServiceModel.DomainServices.Server;

            // TODO: Create methods containing your application logic.
            [EnableClientAccess()]
            public class DomainService1 : DomainService
            {
                [Invoke]
                void DoSomething()
                {
                }
            }
        }

    Then I build the whole solution and... nothing happens in the client project. The Generated_Code folder is empty, and the domain context object for the service is not there. Funny enough, if I add a new item of type Authentication Domain Service, it works as expected: the file RIAServicesLibrary1.Web.g.cs is created in the client project, containing the AuthenticationDomainService1 class as expected. So what's happening here? Am I doing something wrong? Note: I am using Visual Studio 2010 Ultimate RTM and WCF RIA Services 1.0.

    Read the article

  • Data from the web services in an RSS feed

    - by vymz
    I only have the .wsdl and I want to put the data that the web services return into an RSS feed. The SAP web page I am working with can only take RSS uploads, so I need to put the web service information into the RSS feed. For example, I currently put the information (name and total value) manually into the <title> and <description> fields; these data are extracted from the web services. But sometimes I don't know how much information the web service will return. Also, I know that RSS is not meant to store information such as web service data.

        <?xml version="1.0" encoding="UTF-8" ?>
        <rss version="2.0">
          <channel>
            <title>Test RSS</title>
            <link>http://solutions.com</link>
            <description>RSS</description>
            <item>
              <title>Luiz</title>
              <link>http://www.solutions.com/prueba1</link>
              <description>10</description>
            </item>
            <item>
              <title>Clodoaldo</title>
              <link>http://www.solutions.com/prueba2</link>
              <description>5</description>
            </item>
          </channel>
        </rss>
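
    Since the question is about turning web service results into RSS, here is a minimal C# sketch of one way to do it with System.ServiceModel.Syndication. The Entry type and the link URL scheme are hypothetical stand-ins for whatever the generated .wsdl client actually returns; they are not part of the question.

        using System;
        using System.Collections.Generic;
        using System.Linq;
        using System.ServiceModel.Syndication;
        using System.Xml;

        class RssExport
        {
            // Hypothetical shape of one record returned by the generated web service client.
            public class Entry
            {
                public string Name { get; set; }
                public decimal Total { get; set; }
            }

            public static void WriteFeed(IEnumerable<Entry> results, string path)
            {
                var feed = new SyndicationFeed("Test RSS", "RSS", new Uri("http://solutions.com"))
                {
                    // One <item> per record, however many the service returns.
                    Items = results.Select(r => new SyndicationItem(
                        r.Name,                                           // <title>
                        r.Total.ToString(),                               // <description>
                        new Uri("http://www.solutions.com/" + r.Name)))   // <link> (hypothetical URL scheme)
                };

                using (var writer = XmlWriter.Create(path, new XmlWriterSettings { Indent = true }))
                {
                    new Rss20FeedFormatter(feed).WriteTo(writer);
                }
            }
        }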

    Read the article

  • RESTful Java-based web services in JSON + HTML5 and JavaScript, no templates (JSP/JSF/FreeMarker), aka fat/thick client

    - by Ismail Marmoush
    I have this idea of building a website that serves JSON data through a RESTful services framework and uses no template engines like JSP/JSF/FreeMarker, just pure HTML5 and JavaScript libraries. What do you think of the pros and cons of such a design? Just for elaboration and brainstorming, a friend of mine argued with the following concerns:
    • Sounds like GWT.
    • This way you won't have any control over your service API. For example, say you want to charge the user per request; how will you handle it?
    • How will you control your design and themes?
    • What about the first request the browser makes? Not easy with this.
    • All of the user's requests will come with an "Accept" header of "application/json"; how will you separate a browser from an abuser?
    • This way all of your public APIs will be used abusively by third-party apps, and you won't be able to lock them down, since you won't be able to block the normal user's browser.
    • We won't use compiled HTML anyway, but maybe something like FreeMarker, and in that case you won't expose any of your JSON resources to the unauthorized user; but you will expose all the HTML, since any browser can access it. All the well-known first-class services do this.
    • Can you send me links to what you've read?
    • Keep in mind DOM-based XSS; it will be a nightmare, of course, if what you say is applicable.

    Read the article

  • Should Starting a Quick Game via Google Game Services be Iterated?

    - by user46727
    I have been following this tutorial for Google Play Game Services. I am a little unclear as to whether the room matching algorithm should be looped or not. Can I just initialize this process once and let it time out? Or does iterating through it somehow recheck it? If anyone knows the approximate timeout, that would be great as well. The problem stems from the fact that even when both phones sign into Game Services at virtually the same time (my friend and I logged in together), the room does not register multiple people. One time my friend's phone even entered the game map, showing that he was somehow able to progress past the room initialization process. These are the relevant screen update methods from which I am starting the matchmaking process:

        @Override
        public void update(float deltaTime) {
            game.options.updateTiles();
            if (!isInitiated) {
                startQuickGame();
            }
        }

        private void startQuickGame() {
            // auto-match criteria to invite one random automatch opponent.
            // You can also specify more opponents (up to 3).
            if (game.mGoogleClient.isConnected() && !isInitiated) {
                Bundle am = RoomConfig.createAutoMatchCriteria(1, 3, 0);

                // build the room config:
                RoomConfig.Builder roomConfigBuilder = RoomConfig.builder(Network.getInstance());
                roomConfigBuilder.setMessageReceivedListener(Network.getInstance());
                roomConfigBuilder.setRoomStatusUpdateListener(Network.getInstance());
                roomConfigBuilder.setAutoMatchCriteria(am);
                RoomConfig roomConfig = roomConfigBuilder.build();

                // create room:
                Games.RealTimeMultiplayer.create(game.mGoogleClient, roomConfig);

                // go to game screen
                this.mRoom = Network.getInstance().getRoom();
                if (this.mRoom != null && this.mRoom.getParticipants().size() >= 2) {
                    game.setScreen(new MultiGameScreen(game, this.mRoom));
                    isInitiated = true;
                }
            } else {
                game.mGoogleClient.connect();
            }
        }

    Read the article

  • Deployment Error: Silverlight 4.0 w/WCF RIA Services in ASP.NET MVC 2 App

    - by Dennis Ward
    I've got an MVC 2 app with an RIA Services link to a Silverlight application. On my local machine, all is well, but when I deploy to Discount ASP servers, neither the MVC controller nor the WCF RIA services called from Silverlight work. A Silverlight DataGrid gets a load error: System.ServiceModel.DomainServices.Client.DomainOperationException: Load operation failed for query... The remote server returned an error: NotFound. In the MVC page where I had a simple table that worked prior to adding an EF model and DomainDataSource, I now get the error: Unable to load one or more of the requested types. Retrieve the LoaderExceptions property for more information. This is very similar to an issue I had before, after upgrading from the betas of WCF RIA Services/Silverlight 4, but the fix I added there no longer seems to work. The link for that issue is: SL4/MVC2/WCF RIA Services = Load Error I'm really struggling with deploying, and could use some help if anybody can shed any light on this. Thanks! Dennis

    Read the article

  • WCF Data Services implementation strategies.

    - by Nix
    Microsoft has done a savvy job of not outlining the actual place for Data Services in the wonderful world of SOA/web dev. So my question is simple: are WCF Data Services designed to be used by clients? Or has anyone ever heard of someone using them on the server side? Simple scenario: a general layered architecture using business objects (BO), where the parentheses indicate what is being passed between layers:
    (XML) WCF Service - (BO) Business Logic - (BO) DAO - Entity Framework
    Or, using Data Services (where DS BO are modeled business entities to be used by the data service):
    (XML) WCF Service - (BO) Business Logic - (BO) WCF Data Service - (DS BO) Server
    I can't see a use for the latter, unless there are going to be a lot of cases where people access your data via your Data Service layer rather than the Service layer. Thoughts, anyone? I have not seen any mention of using Data Services from within a service layer...
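
    For context, the usual consumption pattern the question alludes to is a client-side DataServiceContext that turns LINQ queries into OData/HTTP requests against the service. Here is a minimal sketch, assuming a reference to System.Data.Services.Client; the service URI, the Customers entity set and the Customer type are hypothetical.

        using System;
        using System.Data.Services.Client;
        using System.Linq;

        class DataServiceClientDemo
        {
            // Hypothetical entity exposed by the data service; the "ID" property name
            // follows the client library's default key convention.
            public class Customer
            {
                public int ID { get; set; }
                public string Name { get; set; }
            }

            static void Main()
            {
                // Point the context at the .svc endpoint; the query below is sent as an OData request.
                var context = new DataServiceContext(new Uri("http://example.com/MyData.svc"));

                var customers = context.CreateQuery<Customer>("Customers")
                                       .Where(c => c.Name.StartsWith("A"))
                                       .ToList();

                foreach (var c in customers)
                {
                    Console.WriteLine("{0}: {1}", c.ID, c.Name);
                }
            }
        }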

    Read the article
