Search Results

Search found 16252 results on 651 pages for 'entity framework 5'.

Page 15/651

  • Entity/Component based engine rendering separation from logic

    - by Denis Narushevich
    I noticed in Unity3D that each gameObject (entity) has its own renderer component, which, as far as I understand, handles rendering logic. I wonder if it is common practice in entity/component based engines for a single entity to hold renderer components and logic components (such as position and behavior) together in one box. That approach sounds odd to me; in my understanding the entity itself belongs to the logic side and shouldn't contain anything render-specific. With such an approach it is impossible to swap renderers without rewriting all of those customized renderers. The way I would do it is to have the entity contain only logic-specific components, like AI, transform and scripts, plus a reference to a mesh or sprite. Then some entity with a Camera component would store references to every object visible to the camera. And in order to render all of that, I would pass the Camera reference to a Renderer class and render all the sprites and meshes of the visible entities. Is such an approach somehow wrong?
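
    For what it's worth, a bare-bones sketch of the split described above might look like the following (all type names here - Entity, Transform, MeshReference, Camera, Renderer - are hypothetical, not Unity3D types): entities hold only logic/data components plus a mesh reference, a camera tracks what it can see, and a swappable renderer walks that list.

        using System.Collections.Generic;

        // Logic-side data: no rendering code lives here.
        public class Transform
        {
            public float X, Y, Z;
        }

        public class MeshReference
        {
            public string MeshId;   // which mesh or sprite asset to draw
        }

        public class Entity
        {
            public Transform Transform = new Transform();
            public MeshReference Mesh;   // null if the entity is not drawable
        }

        // The camera (or a culling step) keeps a list of the entities it can currently see.
        public class Camera
        {
            public readonly List<Entity> VisibleEntities = new List<Entity>();
        }

        // Rendering side: this class can be swapped out without touching the entities.
        public class Renderer
        {
            public void Render(Camera camera)
            {
                foreach (Entity entity in camera.VisibleEntities)
                {
                    if (entity.Mesh != null)
                        DrawMesh(entity.Mesh.MeshId, entity.Transform);
                }
            }

            private void DrawMesh(string meshId, Transform transform)
            {
                // graphics-API-specific draw call would go here
            }
        }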

    Read the article

  • Repair .NET Framework on Windows 2008 R2

    - by Niels R.
    One of our web servers has become inoperable and, after some searching, we think the .NET Framework might be corrupted in some way. The server runs Windows 2008 R2 and uses the 2.0 framework for the ASP.NET application that is (or rather: was) running under IIS 7.5. I'm wondering how we can reinstall the .NET 2.0 Framework on Windows 2008 R2. Any ideas? Kind regards, Niels R.

    Read the article

  • Open source report framework

    - by Tiax
    I'm looking for an open source report framework for statistics. A more detailed explanation: we have a number of tests running on different servers, collecting data (for example, login time) every 5 minutes. What we need is a framework that collects this data (or exposes web services for us to push the data into the framework) and presents it in the form of graphs and so on. Does anyone know of a framework that's easy to use out of the box but has the power to grow, if you know what I mean? Thanks in advance!

    Read the article

  • Creating Entity Framework objects with Unity for Unit of Work/Repository pattern

    - by TobyEvans
    Hi there, I'm trying to implement the Unit of Work/Repository pattern, as described here: http://blogs.msdn.com/adonet/archive/2009/06/16/using-repository-and-unit-of-work-patterns-with-entity-framework-4-0.aspx. This requires each Repository to accept an IUnitOfWork implementation, e.g. an EF data context extended with a partial class to add an IUnitOfWork interface. I'm actually using .NET 3.5, not 4.0. My basic Data Access constructor looks like this:

        public DataAccessLayer(IUnitOfWork unitOfWork, IRealtimeRepository realTimeRepository)
        {
            this.unitOfWork = unitOfWork;
            this.realTimeRepository = realTimeRepository;
        }

    So far, so good. What I'm trying to do is add Dependency Injection using the Unity Framework. Getting the EF data context to be created by Unity was an adventure, as Unity had trouble resolving the constructor - what I did in the end was to add an overloaded constructor to my partial class and mark it with [InjectionConstructor]:

        [InjectionConstructor]
        public communergyEntities(string connectionString, string containerName)
            : this()
        {

    (I know I need to pass the connection string to the base object; that can wait until I've got all the objects initialising correctly.) So, using this technique, I can happily resolve my Entity Framework object as an IUnitOfWork instance thus:

        using (IUnityContainer container = new UnityContainer())
        {
            container.RegisterType<IUnitOfWork, communergyEntities>();
            container.Configure<InjectedMembers>()
                .ConfigureInjectionFor<communergyEntities>(
                    new InjectionConstructor("a", "b"));
            DataAccessLayer target = container.Resolve<DataAccessLayer>();

    Great. What I need to do now is create the reference to the repository object for the DataAccessLayer - the DAL only needs to know the interface, so I'm guessing that I need to instantiate it as part of the Unity Resolve call, passing it the appropriate IUnitOfWork instance. In the past, I would have just passed the Repository constructor the db connection string, and it would have gone away, created a local Entity Framework object and used it just for the lifetime of the Repository method. This is different, in that I create an Entity Framework instance as an IUnitOfWork implementation during the Unity Resolve call, and it's that instance I need to pass into the constructor of the Repository - is that possible, and if so, how? I'm wondering if I could make the Repository a property and mark it as a Dependency, but that still wouldn't solve the problem of how to create the Repository with the IUnitOfWork object that the DAL is being resolved with. I'm not sure I've understood this pattern correctly, and will happily take advice on the best way to implement it - Entity Framework is staying, but Unity can be swapped out if it's not the best approach. If I've got the whole thing upside down, please tell me. Thanks
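
    One possible way to wire the repository in - purely a sketch, assuming a RealtimeRepository class exists that implements IRealtimeRepository and takes an IUnitOfWork in its constructor (names beyond what the question shows are hypothetical) - is to register the context with a lifetime manager so Unity hands the same communergyEntities instance to everything it builds in that container, and to register the repository interface alongside it:

        // Sketch only: RealtimeRepository is an assumed implementation of IRealtimeRepository
        // whose constructor accepts an IUnitOfWork.
        using (IUnityContainer container = new UnityContainer())
        {
            // ContainerControlledLifetimeManager makes the container hand out one shared
            // communergyEntities instance, so the DAL and the repository share the same unit of work.
            container.RegisterType<IUnitOfWork, communergyEntities>(new ContainerControlledLifetimeManager());
            container.RegisterType<IRealtimeRepository, RealtimeRepository>();
            container.Configure<InjectedMembers>()
                .ConfigureInjectionFor<communergyEntities>(
                    new InjectionConstructor("a", "b"));

            // Unity now resolves the repository (and its IUnitOfWork) automatically.
            DataAccessLayer target = container.Resolve<DataAccessLayer>();
        }

    With ContainerControlledLifetimeManager the context lives as long as the container, so the container itself should be scoped to one unit of work (or a narrower lifetime manager substituted).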

    Read the article

  • ADO.NET Entity Framework with OLE DB Access Data Source

    - by Tim Long
    Has anyone found a way to make the ADO.NET Entity Framework work with OLE DB or ODBC data sources? Specifically, I need to work with an Access database that for various reasons can't be upsized to SQL. This MSDN page says: The .NET Framework includes ADO.NET providers for direct access to Microsoft SQL Server (including Entity Framework support), and for indirect access to other databases with ODBC and OLE DB drivers (see .NET Framework Data Providers). For direct access to other databases, many third-party providers are available as shown below. The reference to "indirect access to other databases" is tantalising but I confess that I am hopelessly confused by all the different names for data access technology.

    Read the article

  • Best practices for using the Entity Framework with WPF DataBinding

    - by Ken Smith
    I'm in the process of building my first real WPF application (i.e., the first intended to be used by someone besides me), and I'm still wrapping my head around the best way to do things in WPF. It's a fairly simple data access application using the still-fairly-new Entity Framework, but I haven't been able to find a lot of guidance online for the best way to use these two technologies (WPF and EF) together. So I thought I'd toss out how I'm approaching it, and see if anyone has any better suggestions.

    I'm using the Entity Framework with SQL Server 2008. The EF strikes me as both much more complicated than it needs to be, and not yet mature, but Linq-to-SQL is apparently dead, so I might as well use the technology that MS seems to be focusing on.

    This is a simple application, so I haven't (yet) seen fit to build a separate data layer around it. When I want to get at data, I use fairly simple Linq-to-Entity queries, usually straight from my code-behind, e.g.:

        var families = from family in entities.Family.Include("Person")
                       orderby family.PrimaryLastName, family.Tag
                       select family;

    Linq-to-Entity queries return an IOrderedQueryable result, which doesn't automatically reflect changes in the underlying data, e.g., if I add a new record via code to the entity data model, the existence of this new record is not automatically reflected in the various controls referencing the Linq query. Consequently, I'm throwing the results of these queries into an ObservableCollection, to capture underlying data changes:

        familyOC = new ObservableCollection<Family>(families.ToList());

    I then map the ObservableCollection to a CollectionViewSource, so that I can get filtering, sorting, etc., without having to return to the database.

        familyCVS.Source = familyOC;
        familyCVS.View.Filter = new Predicate<object>(ApplyFamilyFilter);
        familyCVS.View.SortDescriptions.Add(new System.ComponentModel.SortDescription("PrimaryLastName", System.ComponentModel.ListSortDirection.Ascending));
        familyCVS.View.SortDescriptions.Add(new System.ComponentModel.SortDescription("Tag", System.ComponentModel.ListSortDirection.Ascending));

    I then bind the various controls and what-not to that CollectionViewSource:

        <ListBox DockPanel.Dock="Bottom" Margin="5,5,5,5" Name="familyList"
                 ItemsSource="{Binding Source={StaticResource familyCVS}, Path=., Mode=TwoWay}"
                 IsSynchronizedWithCurrentItem="True"
                 ItemTemplate="{StaticResource familyTemplate}"
                 SelectionChanged="familyList_SelectionChanged" />

    When I need to add or delete records/objects, I manually do so from both the entity data model and the ObservableCollection:

        private void DeletePerson(Person person)
        {
            entities.DeleteObject(person);
            entities.SaveChanges();
            personOC.Remove(person);
        }

    I'm generally using StackPanel and DockPanel controls to position elements. Sometimes I'll use a Grid, but it seems hard to maintain: if you want to add a new row to the top of your grid, you have to touch every control directly hosted by the grid to tell it to use a new line. Uggh. (Microsoft has never really seemed to get the DRY concept.)

    I almost never use the VS WPF designer to add, modify or position controls. The WPF designer that comes with VS is sort of vaguely helpful to see what your form is going to look like, but even then, well, not really, especially if you're using data templates that aren't binding to data that's available at design time. If I need to edit my XAML, I take it like a man and do it manually. Most of my real code is in C# rather than XAML.
    As I've mentioned elsewhere, entirely aside from the fact that I'm not yet used to "thinking" in it, XAML strikes me as a clunky, ugly language, that also happens to come with poor designer and intellisense support, and that can't be debugged. Uggh. Consequently, whenever I can see clearly how to do something in C# code-behind that I can't easily see how to do in XAML, I do it in C#, with no apologies. There's been plenty written about how it's a good practice to almost never use code-behind in a WPF page (say, for event-handling), but so far at least, that makes no sense to me whatsoever. Why should I do something in an ugly, clunky language with god-awful syntax, an astonishingly bad editor, and virtually no type safety, when I can use a nice, clean language like C# that has a world-class editor, near-perfect intellisense, and unparalleled type safety? So that's where I'm at. Any suggestions? Am I missing any big parts of this? Anything that I should really think about doing differently?

    Read the article

  • Does Sandcastle support Entity Framework Partial Classes?

    - by ChrisHDog
    I am attempting to use Sandcastle (and Sandcastle Help File Builder) to do some "auto-documentation" of some classes I am using. The classes that are giving me trouble are some partial classes on Entity Framework items that add methods and properties to those Framework items. The triple-slash comments don't appear to come through on the methods and properties created in the partial classes. I have worked out how to get XML documentation for the base properties using the short summary and long description fields in the .edmx editor, but that doesn't provide a solution for the items in the partial classes. Is this possible? Is it perhaps just settings that I'm not setting correctly to pick up the partial classes? Does Sandcastle handle partial classes in non-Entity Framework settings? Is what I'm doing even possible (has anyone else successfully used the XML created from triple-slash comments to create documentation on Entity Framework partial classes, and if so, how did you do that)? Any assistance is appreciated.

    Read the article

  • Where to start .NET Entity Framework and ORM?

    - by Freshblood
    Hello. I haven't used any database system much, but I believe I understand the logic of databases and I have learnt a little SQL, so shouldn't I learn them well before starting on an ORM? Where can I start learning the .NET Entity Framework, and which version of the framework should I start with, 3.5 or 4.0? I heard that 4.0 has strong support for the Entity Framework. I am looking for sources: web pages, e-books or anything else.

    Read the article

  • ADO.NET Data Services Entity Framework request error when property setter is internal

    - by Jim Straatman
    I receive an error message when exposing an ADO.NET Data Service using an Entity Framework data model that contains an entity (called "Case") with an internal setter on a property. If I modify the setter to be public (using the entity designer), the data service works fine. I don't need the entity "Case" exposed in the data service, so I tried to limit which entities are exposed using SetEntitySetAccessRule. This didn't work, and the service endpoint fails with the same error.

        public static void InitializeService(IDataServiceConfiguration config)
        {
            config.SetEntitySetAccessRule("User", EntitySetRights.AllRead);
        }

    The error message is reported in a browser when the .svc endpoint is called. It is very generic, and reads "Request Error. The server encountered an error processing the request. See server logs for more details." Unfortunately, there are no entries in the System and Application event logs. I found this stackoverflow question that shows how to configure tracing on the service. After doing so, the following NullReferenceException was reported in the trace log. Does anyone know how to avoid this exception when including an entity with an internal setter?

        System.NullReferenceException: Object reference not set to an instance of an object.
           at System.Data.Services.Providers.ObjectContextServiceProvider.PopulateMemberMetadata(ResourceType resourceType, MetadataWorkspace workspace, IDictionary`2 entitySets, IDictionary`2 knownTypes)
           at System.Data.Services.Providers.ObjectContextServiceProvider.PopulateMetadata(IDictionary`2 knownTypes, IDictionary`2 entitySets)
           at System.Data.Services.Providers.BaseServiceProvider.PopulateMetadata()
           at System.Data.Services.DataService`1.CreateProvider(Type dataServiceType, Object dataSourceInstance, DataServiceConfiguration& configuration)
           at System.Data.Services.DataService`1.EnsureProviderAndConfigForRequest()
           at System.Data.Services.DataService`1.ProcessRequestForMessage(Stream messageBody)
           at SyncInvokeProcessRequestForMessage(Object , Object[] , Object[] )
           at System.ServiceModel.Dispatcher.SyncMethodInvoker.Invoke(Object instance, Object[] inputs, Object[]& outputs)
           at System.ServiceModel.Dispatcher.DispatchOperationRuntime.InvokeBegin(MessageRpc& rpc)
           at System.ServiceModel.Dispatcher.ImmutableDispatchRuntime.ProcessMessage5(MessageRpc& rpc)
           at System.ServiceModel.Dispatcher.ImmutableDispatchRuntime.ProcessMessage4(MessageRpc& rpc)
           at System.ServiceModel.Dispatcher.ImmutableDispatchRuntime.ProcessMessage3(MessageRpc& rpc)
           at System.ServiceModel.Dispatcher.ImmutableDispatchRuntime.ProcessMessage2(MessageRpc& rpc)
           at System.ServiceModel.Dispatcher.ImmutableDispatchRuntime.ProcessMessage1(MessageRpc& rpc)
           at System.ServiceModel.Dispatcher.MessageRpc.Process(Boolean isOperationContextSet)

    Read the article

  • What is the best way to update an unattached entity on Entity Framework?

    - by Carlos Loth
    Hi, in my project I have some data classes to retrieve data from the database using the Entity Framework. We called these classes *EntityName*Manager. All of them have a method to retrieve entities from the database, and they mostly behave like this:

        static public EntityA SelectByName(String name)
        {
            using (var context = new ApplicationContext())
            {
                var query = from a in context.EntityASet
                            where a.Name == name
                            select a;
                try
                {
                    var entityA = query.First();
                    context.Detach(entityA);
                    return entityA;
                }
                catch (InvalidOperationException ex)
                {
                    throw new DataLayerException(
                        String.Format("The entityA whose name is '{0}' was not found.", name), ex);
                }
            }
        }

    You can see that I detach the entity before returning it to the method caller. So, my question is: what is the best way to create an update method on my *EntityA*Manager class? I'd like to pass the modified entity as a parameter of the method, but I haven't figured out a way of doing it without going to the database, reloading the entity and updating its values inside a new context. Any ideas? Thanks in advance, Carlos Loth.
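
    If the project is on Entity Framework 4, one option is to attach the detached, modified entity to a fresh context and mark it as modified, so SaveChanges writes its current values without reloading anything. The following is only a sketch of that idea (EntityASet and ApplicationContext come from the question; the rest is assumed):

        // Sketch: re-attach a previously detached EntityA and persist its current values.
        static public void Update(EntityA modifiedEntity)
        {
            using (var context = new ApplicationContext())
            {
                context.EntityASet.Attach(modifiedEntity);

                // Mark the whole entity as modified so all scalar properties are sent on SaveChanges.
                context.ObjectStateManager.ChangeObjectState(modifiedEntity, EntityState.Modified);
                context.SaveChanges();
            }
        }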

    Read the article

  • Business entity: private instance VS single instance

    - by taoufik
    Suppose my WinForms application has a business entity Order; the entity is used in multiple views, and each view handles a different domain or use-case in the application. As an example, one view manages orders while another digs into one order and displays additional data. If I used NHibernate (or any other ORM) with one session/data context per view (or per DB action), I'd end up with two different instances for the same Order (let's say orderId = 1). Although functionally the same entity, they are technically two different instances. Yes, I could implement Equals/GetHashCode to make them "seem" the same. Why would you go for a single instance per entity vs. private instances per view or per use-case? Having single instances has the advantage of sharing INotifyPropertyChanged events and sharing additional (non-persistent) data. Having a private instance in each view would give you the flexibility of undo functionality at the view level. In the example above, I'd allow the user to change order details and give them the flexibility not to save the change. Here, synchronisation between the views/use-cases happens at the data persistence level. What would your argument be?

    Read the article

  • Entity Framework in layered architecture

    - by Kamyar
    I am using a layered architecture with the Entity Framework. Here's what I have come up with so far (all the projects except UI are class libraries):

    - Entities: The POCO entities. Completely persistence ignorant. No references to other projects. Generated by Microsoft's ADO.Net POCO Entity Generator.
    - DAL: The EDMX (entity model) file with the context class (T4 generated). References: Entities.
    - BLL: Business Logic Layer. Will implement the repository pattern on this layer. References: Entities, DAL. This is where the ObjectContext gets populated: var ctx = new DAL.MyDBEntities();
    - UI: The presentation layer: an ASP.NET website. References: Entities, BLL + a connection string entry for the entities in the config file (question #2).

    Now my three questions:

    1. Is my layer distinction approach correct?
    2. In my UI, I access the BLL as follows:

        var customerRep = new BLL.CustomerRepository();
        var customer = customerRep.GetByID(myCustomerID);

    The problem is that I have to define the entities connection string in my UI's web.config/app.config, otherwise I get a runtime exception. Does defining the entities connection string in the UI spoil the layers' distinction? Or is it acceptable in a multi-layered architecture?
    3. Should I take any additional steps to perform change tracking, lazy loading, etc. (by etc. I mean the features that Entity Framework covers in a conventional, 1 project, non-POCO code generation setup)?

    Thanks and apologies for the lengthy question.
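
    For reference, a minimal sketch of the BLL repository used in question #2 might look like this (the Customer entity and CustomerId property names are assumptions):

        // Sketch of BLL.CustomerRepository; Customer/CustomerId are placeholder names.
        public class CustomerRepository
        {
            public Customer GetByID(int customerId)
            {
                // The context is constructed here, but the connection string is read from the
                // config file of the executing application (the UI's web.config), because class
                // libraries don't load their own config files at runtime.
                using (var ctx = new DAL.MyDBEntities())
                {
                    return ctx.Customers.FirstOrDefault(c => c.CustomerId == customerId);
                }
            }
        }

    That is why the connection string entry in the UI's web.config is hard to avoid; it reflects where the code runs rather than a leak in the layering.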

    Read the article

  • Advice Please: SQL Server Identity vs Unique Identifier keys when using Entity Framework

    - by c.batt
    I'm in the process of designing a fairly complex system. One of our primary concerns is supporting SQL Server peer-to-peer replication. The idea is to support several geographically separated nodes. A secondary concern has been using a modern ORM in the middle tier. Our first choice has always been Entity Framework, mainly because the developers like to work with it. (They love the LINQ support.) So here's the problem: with peer-to-peer replication in mind, I settled on using uniqueidentifier with a default value of newsequentialid() for the primary key of every table. This seemed to provide a good balance between avoiding key collisions and reducing index fragmentation. However, it turns out that the current version of Entity Framework has a very strange limitation: if an entity's key column is a uniqueidentifier (GUID) then it cannot be configured to use the default value (newsequentialid()) provided by the database. The application layer must generate the GUID and populate the key value. So here's the debate:

    1. Abandon Entity Framework and use another ORM:
       - use NHibernate and give up LINQ support
       - use linq2sql and give up future support (not to mention get bound to SQL Server on DB)
    2. Abandon GUIDs and go with another PK strategy
    3. Devise a method to generate sequential GUIDs (COMBs?) at the application layer

    I'm leaning towards option 1 with linq2sql (my developers really like linq2[stuff]) and 3. That's mainly because I'm somewhat ignorant of alternate key strategies that support the replication scheme we're aiming for while also keeping things sane from a developer's perspective. Any insight or opinion would be greatly appreciated.
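
    For option 3, one widely used recipe is the COMB GUID (popularised by Jimmy Nilsson and available in NHibernate as the guid.comb generator): overwrite the last six bytes of a random GUID with timestamp-derived bytes, since those are the bytes SQL Server compares first when ordering uniqueidentifier values. A rough sketch, not production-hardened code:

        using System;

        // Sketch of a COMB-style sequential GUID generator for application-side key generation.
        public static class CombGuid
        {
            public static Guid NewComb()
            {
                byte[] guidBytes = Guid.NewGuid().ToByteArray();

                // Days since 1900-01-01 and ~1/300s ticks since midnight, as in the classic COMB recipe.
                DateTime now = DateTime.UtcNow;
                TimeSpan days = new TimeSpan(now.Ticks - new DateTime(1900, 1, 1).Ticks);
                TimeSpan msecs = now.TimeOfDay;

                byte[] daysBytes = BitConverter.GetBytes(days.Days);
                byte[] msecsBytes = BitConverter.GetBytes((long)(msecs.TotalMilliseconds / 3.333333));

                // Reverse to match SQL Server's byte ordering, then copy into the last 6 bytes of the GUID.
                Array.Reverse(daysBytes);
                Array.Reverse(msecsBytes);
                Array.Copy(daysBytes, daysBytes.Length - 2, guidBytes, guidBytes.Length - 6, 2);
                Array.Copy(msecsBytes, msecsBytes.Length - 4, guidBytes, guidBytes.Length - 4, 4);

                return new Guid(guidBytes);
            }
        }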

    Read the article

  • Alternatives to the Entity Framework for Serving/Consuming an OData Interface

    - by Egahn
    I'm researching how to set up an OData interface to our database. I would like to be able to pull/query data from our DB into Excel, as a start. Eventually I would like to have Excel run queries and pull data over HTTP from a remote client, including authentication, etc. I've set up a working (rickety) prototype so far, using the ADO.NET Entity Data Model wizard in Visual Studio, and VSTO to create a test Excel worksheet with a button to pull from that ADO.NET interface. This works OK so far, and I can query the DB using Linq through the entities/objects that are created by the ADO.NET EDM wizard. However, I have started to run into some problems with this approach. I've been finding the Entity Framework difficult to work with (and in fact, also difficult to research solutions to, as there's a lot of chaff out there regarding it and older versions of it). An example of this is my being unable to figure out how to set the SQL command timeout (as opposed to the HTTP request timeout) on the DataServiceContext object that the wizard generates for my schema, but that's not the point of my question. The real question I have is, if I want to use OData as my interface standard, am I stuck with the Entity Framework? Are there any other solutions out there (preferably open source) which can set up, serve and consume an OData interface, and are easier to work with and less bloated than the Entity Framework? I have seen mention of NHibernate as an alternative, but most of the comparison threads I've seen are a few years old. Are there any other alternatives out there now? Thanks very much!

    Read the article

  • Entity Framework 4 + POCO with custom classes and WCF contracts (serialization problem)

    - by eman
    Yesterday I worked on a project where I upgraded to Entity Framework 4 with the Repository pattern. In one post I read that it is necessary to turn off the custom tool that generates the classes and then write the classes (matching the entities) by hand. To do that, I used the POCO Entity Generator, then deleted the newly generated .tt files and all subordinate .cs classes and wrote the "entity classes" myself. I added the repository pattern and implemented it in the business layer, and then implemented a WCF layer which calls the methods from the business layer. Calling an Insert (Add) method from the presentation layer works fine, but if I call any method that should return some class, I get an error like "the connection was interrupted by the server". I suppose there is a problem with the serialization, or am I wrong? How can this problem be solved? I'm using Visual Studio 2010, Entity Framework 4 and C#.

    UPDATE: I have uploaded the project and hope somebody can help me! link text

    UPDATE 2: My questions: Why is POCO good (pros/cons)? When should POCO be used? Is POCO + the repository pattern a good choice? Should POCO classes be written by myself or could I use auto generated POCO classes?
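
    If the failure is the serializer choking on EF4's dynamic proxies, a common workaround (the same one described in the "Entity Framework 4, WCF & Lazy Loading Tip" entry further down this page) is to switch proxy creation and lazy loading off before returning entities from the WCF service, so plain POCO instances are serialized. A sketch, with MyEntities and Customer as placeholder names:

        // Sketch: hand plain POCO instances, not EF dynamic proxies, to the WCF serializer.
        public List<Customer> GetCustomers()
        {
            using (var context = new MyEntities())
            {
                context.ContextOptions.ProxyCreationEnabled = false;
                context.ContextOptions.LazyLoadingEnabled = false;

                return context.Customers.ToList();
            }
        }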

    Read the article

  • Strange JPA one-to-many behavior when trying to set the "many" on the "one" entity

    - by errr
    I've mapped two entities using JPA (specifically Hibernate). Those entities have a one-to-many relationship (I've simplified for presentation):

        @Entity
        public class A {
            @ManyToOne
            public B getB() { return b; }
        }

        @Entity
        public class B {
            @OneToMany(mappedBy="b")
            public Set<A> getAs() { return as; }
        }

    Now, I'm trying to create a relationship between two instances of these entities by using the setter of the one-side/not-owner-side of the relationship (i.e. the table being referenced):

        em.getTransaction().begin();
        A a = new A();
        B b = new B();
        Set<A> as = new HashSet<A>();
        as.add(a);
        b.setAs(as);
        em.persist(a);
        em.persist(b);
        em.getTransaction().commit();

    But then, the relationship isn't persisted to the DB (the row created for entity A isn't referencing the row created for entity B). Why is it so? I'd expect it to work. Also, if I remove the "mappedBy" property from the @OneToMany annotation it will work. Again - why is it so? And what are the possible effects of removing the "mappedBy" property?

    Read the article

  • Problem with DeleteObject() in Entity Framework 4

    - by Tom
    I got two entities:

        public class User : Entity
        {
            public virtual string Username { get; set; }
            public virtual string Email { get; set; }
            public virtual string PasswordHash { get; set; }
            public virtual List<Photo> Photos { get; set; }
        }

    and

        public class Photo : Entity
        {
            public virtual string FileName { get; set; }
            public virtual string Thumbnail { get; set; }
            public virtual string Description { get; set; }
            public virtual int UserId { get; set; }
            public virtual User User { get; set; }
        }

    When I try to delete the photo, it works fine for the DB (the record gets removed from the table) but it doesn't for the Photos collection in the User entity (I can still see that photo in user.Photos). This is my code - what am I doing wrong here? Changing entity properties works fine; when I change a photo's FileName, for example, it gets updated in the DB and in user.Photos.

        var photo = SelectById(id);
        Context.DeleteObject(photo);
        Context.SaveChanges();

        var user = GetUser(userName);
        // the photo I have just deleted is still in user.Photos

    I also tried this but get the same results:

        var photo = user.Photos.First();
        Context.DeleteObject(photo);
        Context.SaveChanges();
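
    One way to keep the in-memory graph in step with the delete - just a sketch reusing the helpers from the question, and assuming user was loaded from the same Context - is to remove the photo from the navigation collection as well as from the context:

        // Sketch: update both the in-memory relationship and the store.
        var user = GetUser(userName);
        var photo = SelectById(id);

        user.Photos.Remove(photo);      // keeps the already-loaded User object consistent
        Context.DeleteObject(photo);    // marks the row for deletion
        Context.SaveChanges();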

    Read the article

  • ReflectionTypeLoadException when I try to run Enable-Migrations with Entity Framework 5.0

    - by Eric Anastas
    I'm trying to use Entity Framework for the first time on one of my projects. I'm using the code first workflow to automatically create my database. Initially, setting up the database worked fine. Now I'm trying to migrate changes in my classes into the database. The tutorial I'm reading says I need to run "Enable-Migrations" in the package manager console. Yet when I do this I get the following error:

        PM> Enable-Migrations
        System.Reflection.ReflectionTypeLoadException: Unable to load one or more of the requested types. Retrieve the LoaderExceptions property for more information.
           at System.Reflection.RuntimeModule.GetTypes(RuntimeModule module)
           at System.Reflection.RuntimeModule.GetTypes()
           at System.Reflection.Assembly.GetTypes()
           at System.Data.Entity.Migrations.Design.ToolingFacade.BaseRunner.FindType[TBase](String typeName, Func`2 filter, Func`2 noType, Func`3 multipleTypes, Func`3 noTypeWithName, Func`3 multipleTypesWithName)
           at System.Data.Entity.Migrations.Design.ToolingFacade.GetContextTypeRunner.RunCore()
           at System.Data.Entity.Migrations.Design.ToolingFacade.BaseRunner.Run()
        Unable to load one or more of the requested types. Retrieve the LoaderExceptions property for more information.

    What am I doing wrong? How do I retrieve the LoaderExceptions property? Also, NuGet says I have EF 5.0, but the Version property of the EntityFramework item in my project references says 4.4.0.0. I'm not sure if this is related.
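
    To answer the "how do I retrieve the LoaderExceptions property" part: the exception can be caught in a small throwaway program that forces the same kind of type scan, and its LoaderExceptions array dumped. A minimal sketch ("MyProject.dll" is a placeholder for the assembly that fails to load):

        using System;
        using System.Reflection;

        class DumpLoaderExceptions
        {
            static void Main()
            {
                try
                {
                    // Force the same kind of type scan that Enable-Migrations performs.
                    Assembly assembly = Assembly.LoadFrom("MyProject.dll");
                    assembly.GetTypes();
                }
                catch (ReflectionTypeLoadException ex)
                {
                    // LoaderExceptions lists each individual type/assembly load failure.
                    foreach (Exception loaderException in ex.LoaderExceptions)
                    {
                        Console.WriteLine(loaderException.Message);
                    }
                }
            }
        }

    As an aside, the EF 5 NuGet package installs an EntityFramework assembly versioned 4.4.0.0 when the project targets .NET 4.0 (5.0.0.0 is used for .NET 4.5 projects), so that version difference by itself is expected.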

    Read the article

  • Entity framework self referencing loop detected

    - by Lyd0n
    I have a strange error. I'm experimenting with a .NET 4.5 Web API, Entity Framework and MS SQL Server. I've already created the database and set up the correct primary and foreign keys and relationships. I've created a .edmx model and imported two tables: Employee and Department. A department can have many employees and this relationship exists. I created a new controller called EmployeeController using the scaffolding options to create an API controller with read/write actions using Entity Framework. In the wizard, I selected Employee as the model and the correct entity for the data context. The method that is created looks like this:

        // GET api/Employee
        public IEnumerable<Employee> GetEmployees()
        {
            var employees = db.Employees.Include(e => e.Department);
            return employees.AsEnumerable();
        }

    When I call my API via /api/Employee, I get this error:

        ...The 'ObjectContent`1' type failed to serialize the response body for content type 'application/json; ...System.InvalidOperationException","StackTrace":null,"InnerException":{"Message":"An error has occurred.","ExceptionMessage":"Self referencing loop detected with type 'System.Data.Entity.DynamicProxies.Employee_5D80AD978BC68A1D8BD675852F94E8B550F4CB150ADB8649E8998B7F95422552'. Path '[0].Department.Employees'.","ExceptionType":"Newtonsoft.Json.JsonSerializationException","StackTrace":" ...

    Why is it self referencing [0].Department.Employees? That doesn't make a whole lot of sense. I would expect this to happen if I had circular referencing in my database, but this is a very simple example. What could be going wrong?
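
    Two workarounds commonly used for this, sketched below under the assumption that the generated context is a DbContext and that config is the HttpConfiguration passed to WebApiConfig.Register: tell Json.NET to ignore reference loops, and/or stop EF from creating lazy-loading proxies so only the data explicitly loaded with Include() gets serialized.

        // In WebApiConfig.Register - have Json.NET skip members that loop back on themselves.
        config.Formatters.JsonFormatter.SerializerSettings.ReferenceLoopHandling =
            Newtonsoft.Json.ReferenceLoopHandling.Ignore;

        // In the controller or context constructor - no dynamic proxies, no lazy loading.
        // (With an ObjectContext-based model this would be db.ContextOptions instead.)
        db.Configuration.ProxyCreationEnabled = false;
        db.Configuration.LazyLoadingEnabled = false;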

    Read the article

  • Oracle Utilities Application Framework future feature deprecation

    - by Paula Speranza-Hadley
    From time to time, existing functionality is replaced with alternative features to offer greater flexibility and standardization. In Oracle Utilities Application Framework V4.2.0.0.0 the following features are being announced for deprecation in the next release, or have been previously announced and are not being delivered with this version of the Oracle Utilities Application Framework:

    · No SQL Server support – Oracle Utilities Application Framework V4.2.0.0.0 or above does not ship with any support for SQL Server.
    · No MPL support – Oracle Utilities Application Framework V4.2.0.0.0 or above does not ship with the Multi-Purpose Listener (MPL) component of the XML Application Integration (XAI) component. Customers using the MPL should migrate to Oracle Service Bus.
    · No provided Crystal Reports/Business Objects interface – Oracle Utilities Application Framework V4.2.0.0.0 or above does not ship with a supported Crystal Reports/Business Objects interface. This facility is now available as a downloadable customization for existing or new customers. Maintenance and new features are now the individual customer's responsibility.
    · XAI Servlet deprecation – The XAI Servlet (xaiserver and classicxai) will be removed in the next release of the Oracle Utilities Application Framework. Customers are encouraged to migrate to the native Web Services support as outlined in the XAI Best Practices whitepaper available from My Oracle Support (Doc Id: 942074.1).
    · ConfigLab deprecation – The ConfigLab facility will be removed in the next release of the Oracle Utilities Application Framework for products it is shipped with. Customers are recommended to migrate to the Configuration Migration Assistant, which provides the same and more functionality.
    · Archiving deprecation – The inbuilt Archiving has been removed from Oracle Utilities Application Framework V4.2.0.0.0 or above, for products it is shipped with. Customers considering an Archiving solution should migrate to the Information Lifecycle Management based solution provided for their product.
    · DISTRIBUTED batch execution mode deprecation – The DISTRIBUTED execution mode used by the batch component of the Oracle Utilities Application Framework will be deprecated in the next release of the Oracle Utilities Application Framework. Customers using DISTRIBUTED mode should migrate to CLUSTERED mode as outlined in the Batch Best Practices For Oracle Utilities Application Framework Based Products whitepaper available from My Oracle Support (Doc Id: 836362.1).
    · XAI Schema Editor deprecation – The XAI Schema Editor, which is a component of the Oracle Utilities Software Development Kit, will be removed in the next release of the Oracle Utilities Application Framework. Customers should migrate their existing schemas to Business Object based schemas and use the browser based Schema Editor instead.

    Read the article

  • Entity Framework 4, WCF & Lazy Loading Tip

    - by Dane Morgridge
    If you are doing any work with Entity Framework and custom WCF services in EFv1, everything works great. As soon as you jump to EFv4, you may find yourself getting odd errors that you can't seem to catch. The problem almost always has something to do with the new lazy loading feature in Entity Framework 4. With Entity Framework 1, you didn't have lazy loading, so this problem didn't surface. Assume I have a Person entity and an Address entity where there is a one-to-many relationship between Person and Address (Person has many Addresses). In Entity Framework 1 (or in EFv4 with lazy loading turned off), I would have to load the Address data by hand by either using the Include or Load method:

        var people = context.People.Include("Addresses");

    or

        people.Addresses.Load();

    Lazy loading kicks in the first time the Person.Addresses collection is accessed:

        var people = context.People.ToList();

        // only person data is currently in memory

        foreach(var person in people)
        {
            // EF determines that no Address data has been loaded and lazy loads
            int count = person.Addresses.Count();
        }

    Lazy loading has the useful (and sometimes not useful) feature of fetching data when requested. It can make your life easier or it can make it a big pain. So what does this have to do with WCF? One word: serialization. When you need to pass data over the wire with WCF, the data contract is serialized into either XML or binary depending on the binding you are using. Well, if I am using lazy loading, the Person entity gets serialized and during that process, the Addresses collection is accessed. When that happens, the Address data is lazy loaded. Then the Address is serialized, and the Person property is accessed, and then also serialized, and then the Addresses collection is accessed. Now the second time through, lazy loading doesn't kick in, but you can see the infinite loop caused by this process. This is a problem with any serialization, but I personally found it trying to use WCF. The fix for this is to simply turn off lazy loading. This can be done at each call by using context options:

        context.ContextOptions.LazyLoadingEnabled = false;

    Turning lazy loading off will now allow your classes to be serialized properly. Note, this is if you are using the standard Entity Framework classes. If you are using POCO, you will have to do something slightly different. With POCO, the Entity Framework will create proxy classes by default that allow things like lazy loading to work with POCO. This proxy basically creates a proxy object that is a full Entity Framework object that sits between the context and the POCO object. When using POCO with WCF (or any serialization), just turning off lazy loading doesn't cut it. You have to turn off the proxy creation to ensure that your classes will serialize properly:

        context.ContextOptions.ProxyCreationEnabled = false;

    The nice thing is that you can do this on a call-by-call basis. If you use a new context for each set of operations (which you should), then you can turn either lazy loading or proxy creation on and off as needed.

    Read the article

  • Upgraded to EF6 blew up Universal provider session state for Azure

    - by Ryan
    I have an ASP.NET MVC 4 application that uses the Universal Providers for session state:

        <sessionState mode="Custom" sqlConnectionString="DefaultConnection" customProvider="DefaultSessionProvider">
          <providers>
            <add name="DefaultSessionProvider"
                 type="System.Web.Providers.DefaultSessionStateProvider, System.Web.Providers, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"
                 connectionStringName="DefaultConnection" />
          </providers>
        </sessionState>

    When I upgraded to Entity Framework 6, I started getting this error:

        Method not found: 'System.Data.Objects.ObjectContext System.Data.Entity.Infrastructure.IObjectContextAdapter.get_ObjectContext()'.

    I tried adding the reference to System.Data.Entity.dll back in, but that didn't work, and I know that you're not supposed to add that with the new Entity Framework.

    Read the article

  • OCUnit testing an embedded framework

    - by d11wtq
    I've added a Unit Test target to my Xcode project but it fails to find my framework when it builds, saying: Test.octest could not be loaded because a link error occurred. It is likely that dyld cannot locate a framework framework or library that the the test bundle was linked against, possibly because the framework or library had an incorrect install path at link time. My framework (the main project target) is designed to be embedded and so has an install path of @executable_path/../Frameworks. I've marked the framework as a direct dependency of the test target and I've added it to the "Link Binary with Libraries" build phase. Additionally I've add a first step (after it's built the dependency) of "Copy Files" which simply copies the framework to the unit test bundle's Frameworks directory. Anyone got any experience on this? I'm not sure what I've missed.

    Read the article

  • Visual Studio 2010 and Target Framework Version

    - by Scott Dorman
    Almost two years ago, I wrote about a Visual Studio macro that allows you to change the Target Framework version of all projects in a solution. If you don't know, the Target Framework version is what tells the compiler which version of the .NET Framework to compile against (more information is available here) and can be set to one of the following values:

    - .NET Framework 2.0
    - .NET Framework 3.0
    - .NET Framework 3.5
    - .NET Framework 3.5 Client Profile
    - .NET Framework 4.0
    - .NET Framework 4.0 Client Profile

    This can be easily accomplished by editing the project properties. The problem with this approach is that if you need to change a lot of projects at one time it becomes rather unwieldy. One possible solution is to edit the project files by hand in a text editor and change the <TargetFrameworkVersion /> and <TargetFrameworkProfile /> properties to the correct values. For example, for the .NET Framework 4.0 Client Profile, these values would be:

        <TargetFrameworkVersion>v4.0</TargetFrameworkVersion>
        <TargetFrameworkProfile>Client</TargetFrameworkProfile>

    Again, this is not only time consuming but can also be error-prone. The better solution is to automate this through the use of a Visual Studio macro. Since I had already created a macro to do this for Visual Studio 2008, I updated that macro to work with Visual Studio 2010 and .NET 4.0. It prompts you for the target framework version you want to set for all of the projects and then loops through each project in the solution and makes the change. If you select one of the Framework versions that support a Client Profile, it will ask if you want to use the Client Profile or the Full Profile. It is smart enough to skip project types that don't support this property and projects that are already at the correct version. This version also incorporates the changes suggested by George (in the comments). The macro is available on my SkyDrive account. Download it to your <UserProfile>\Documents\Visual Studio 2010\Projects\VSMacros80\MyMacros folder, open the Visual Studio Macro IDE (Alt-F11) and add it as an existing item to the "MyMacros" project. I make no guarantees or warranties on this macro. I have tested it on several solutions and projects and everything seems to work and not cause any problems, but, as always, use with caution. Since it is a macro, you have the full source code available to investigate and see what it's actually doing. If you find any bugs or make any useful changes, please let me know and I'll update the macro. Technorati Tags: Macros, Visual Studio

    Read the article
