Search Results

Search found 30279 results on 1212 pages for 'database drift'.

Page 884/1212

  • Is tcerl for Mnesia production ready? Are there any alternatives?

    - by Sanoj
    I would like to create a scalable web service using Mnesia as the database. However, Mnesia by default isn't scalable for persistent storage, since it uses Dets (which has a 2 GB limit) as the backend. I have seen discussions about extending Mnesia with MnesiaEx and using tcerl as the backend. It sounds good and has shown good performance. However, in a talk about Tokyo Cabinet and CouchDB with Mnesia I saw that there are some issues: durability problems, memory leaks, and crashes. Is tcerl + Mnesia really production ready? And are there any other alternatives? How do companies overcome these issues if they use Mnesia in bigger systems? Is there a setup with Mnesia and Tokyo Tyrant that works better?

    Read the article

  • SQL Server CE - Internal error: Cannot open the shared memory region

    - by blu
    I have a SQL Server CE database that works fine in dev, but has an issue when installed on the client. The SQL Server CE 3.5 dependencies are copied as part of the deployment. The target machine is a clean Windows 7 32-bit Ultimate image. The message for the exception in the event log is: Internal error: Cannot open the shared memory region. It looks like this is SSCE_M_CANTOPENSHAREDMEMORY, and the site says there isn't a connection string value to change this and that these issues are typically not resolvable by the end developer. Has anyone run into this, and if so, were you able to resolve it?

    Read the article

  • User management for Google Apps

    - by Ali
    Hi guys, I'm modifying our collaboration system so it can be listed on Google Apps. A small issue I'm facing is registering user details. By default, whenever someone logs into their Google Apps account they are pretty much logged into the application. For every action taken by a signed-in user, I store that user's ID whenever an update is made in the database. However, the Google Apps sign-in process is different in that there is nothing visible as a user ID for me to work with. Any ideas?

    Read the article

  • Using Phing's dbdeploy task with transactions

    - by Gordon
    I am using Phing's dbdeploy task to manage my database schema. This works fine as long as there are no errors in my delta file. However, if there is an error, dbdeploy will just run the delta file up to the query with the error and then abort. This causes me some frustration, because I then have to manually roll back the entry in the changelog table. If I don't, dbdeploy will assume the migration was successful on a subsequent try. So the question is: is there any way to get dbdeploy to use transactions?

    Read the article

  • Using LINQ to fetch results from nested SQL queries

    - by Shantanu Gupta
    This is my first question and my first day with LINQ, so it is a bit difficult for me to follow. I want to fetch some records from the database, i.e. select * from tblDepartment where department_id in ( select department_id from tblMap where Guest_Id = @GuestId ) I have two DataTables, tblDepartment and tblMap. Now I want to fetch this result and store it in a third DataTable. How can I do this? After some googling, this is the query I have been able to construct so far: var query = from myrow in _dtDepartment.AsEnumerable() where myrow.Field<int>("Department_Id") == _departmentId select myrow; Please point me to some resources for learning LINQ, mainly for DataTables and DataSets. EDIT: I found a very similar example here, but I am still not able to understand how it works. Please shed some light on it.
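
    A rough, untested sketch of what I think I need (reusing the column names from the SQL above, so treat them as assumptions) would join the two DataTables and copy the matches into the third one:

        using System;
        using System.Data;
        using System.Linq;

        class DepartmentQuery
        {
            // Equivalent of: select * from tblDepartment where department_id in
            //   (select department_id from tblMap where Guest_Id = @GuestId)
            static DataTable FilterDepartments(DataTable dtDepartment, DataTable dtMap, int guestId)
            {
                var query =
                    from dept in dtDepartment.AsEnumerable()
                    join map in dtMap.AsEnumerable()
                        on dept.Field<int>("Department_Id") equals map.Field<int>("Department_Id")
                    where map.Field<int>("Guest_Id") == guestId
                    select dept;

                // Distinct mimics the IN semantics if a department is mapped more than once;
                // CopyToDataTable materialises the matching rows into a third DataTable.
                return query.Distinct(DataRowComparer.Default).CopyToDataTable();
            }
        }

    (This needs a reference to System.Data.DataSetExtensions, and CopyToDataTable throws if no rows match, so I am not sure it is the idiomatic way.)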

    Read the article

  • PHP, Codeigniter: How to set date/time based on the user's timezone/location globally in a web app?

    - by Abs
    Hello all, I have just realised that if I add a record to my MySQL database, it gets the date/time of the server rather than that of the particular user and where they are located, which makes my search-by-date function useless: users can only search by when a record was added in the server's timezone, not in their own. Is there a way in CodeIgniter to globally set the time and date per user's location (maybe using their IP), so that every time I call date() or time() that user's timezone is used? What I am really asking is how to make my application aware of each user's timezone. Maybe it is better to store each user's timezone in their profile, keep a standard time (the server's) in the database, and then convert the time for each user? Thanks all

    Read the article

  • Customizing Rails XML rendering to include extra properties

    - by Isaac Cambron
    Let's say I have a model like this: create_table :ninjas do |t| t.string :name end And the Ninja class with an extra property: class Ninja < ActiveRecord::Base def honorific "#{name}san" end end And in my controller I just want to render it to XML: def show render :xml => Ninja.find(params[:id]) end The honorific part isn't rendered. That makes sense, since it's just a method, but is there a way of tricking it? I'm totally up for answers to the effect of, "You're doing this totally wrong." I'll just add that I really do want to calculate the honorific on the fly, and not, like, store it in the database or something.

    Read the article

  • ODBC: Mapping of literal type names in create table statements

    - by matthias-meyer
    I was wondering whether the data types in a literal "create table" statement, executed over ODBC, are replaced with their database-specific counterparts (the platform is Windows/.NET/C#). I cannot find this feature in the ODBC docs, and there seems to be no list of literal "ODBC data types". However, I know that this works for Oracle, SQL Server and Access; the following statement is executed correctly, although LONGVARBINARY is not a native type in any of these systems: CREATE TABLE MYTABLE (MYCOLUMN LONGVARBINARY) However, for Oracle, for example, the mapped native type depends on the ODBC driver used. Is this an undocumented feature? Is there a list of supported type names anywhere? Thanks!

    Read the article

  • How do you implement a combobox filter using AJAX in ASP.NET?

    - by geocine
    To save some time discussing my problem, you could check the demo below: http://demos.telerik.com/aspnet-ajax/combobox/examples/functionality/filteringcombo/defaultcs.aspx I already checked the ListBoxExtender in the AJAX Control Toolkit, but it didn't give me good results. What I want to do is filter a listbox populated with over 3000 records from the database as the user types. It should match not only the starting letters but also groups of characters found anywhere within each item in the list. The list holds an item name as the value and an item code as the key.
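
    The matching rule itself seems simple enough to express server-side; a minimal sketch (the key/value naming is just my placeholder) could be a case-insensitive "contains" filter over the item names:

        using System;
        using System.Collections.Generic;
        using System.Linq;

        class ComboFilter
        {
            // Key = item code, Value = item name, as described above.
            public static List<KeyValuePair<string, string>> Filter(
                IEnumerable<KeyValuePair<string, string>> items, string typed)
            {
                // Match the typed characters anywhere in the name, not just at the start.
                return items
                    .Where(i => i.Value.IndexOf(typed, StringComparison.OrdinalIgnoreCase) >= 0)
                    .ToList();
            }
        }

    What I can't work out is how to wire something like that up so the listbox re-filters as the user types, without a full postback.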

    Read the article

  • Programmatically find TFS changes since last good build

    - by abigblackman
    I have several branches in TFS (dev, test, stage), and when I merge changes into the test branch I want the automated build and deploy script to find all the updated SQL files and deploy them to the test database. I thought I could do this by finding all the changesets associated with the build since the last good build, finding all the SQL files in those changesets and deploying them. However, the changesets do not seem to be associated with the build for some reason, so my question is twofold: 1) How do I ensure that a changeset is associated with a particular build? 2) How can I get a list of files that have changed in the branch since the last good build? I have the last successful build, but I'm unsure how to get the files without checking the changesets (which, as mentioned above, are not associated with the build!)

    Read the article

  • Making a DataSet from another DataSet

    - by M.H
    Hi folks, I have a client-server project (a small project for companies, in C#) and the server has a DataSet with some tables (there is no database, for various reasons, so we save the DataSet as an XML file). When clients connect to the server, the server should send some information to each client depending on its privileges, and some clients must add to or delete from the DataSet on the server. I am thinking of making a new, smaller DataSet and sending it to the client (as XML), but I don't know how to generate a new DataSet with specific tables and rows (I tried LINQ to DataSet but nothing worked). My questions are: how can I do that, and is this a good way to send information to clients? Can you suggest a better approach for sending data to clients (I mean, instead of making a new DataSet)?
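
    What I have in mind is roughly this untested sketch (the "Orders" table and "ClientId" column are just placeholders for whatever the privilege check looks at): clone the schema of the tables the client needs, import only the rows it may see, and send the resulting DataSet as XML.

        using System.Data;

        class SubsetBuilder
        {
            public static string BuildClientXml(DataSet serverData, string clientId)
            {
                var subset = new DataSet("ClientData");

                // Copy the schema of the table we want, but none of its rows.
                DataTable source = serverData.Tables["Orders"];
                DataTable copy = source.Clone();

                // Import only the rows this client is allowed to see.
                foreach (DataRow row in source.Select("ClientId = '" + clientId + "'"))
                    copy.ImportRow(row);

                subset.Tables.Add(copy);

                // Serialise the smaller DataSet to XML for sending to the client.
                return subset.GetXml();
            }
        }

    (Clone copies the structure only, whereas Copy would bring every row across, which is what I want to avoid.)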

    Read the article

  • Calling Entity Framework function import from code

    - by Mikey Cee
    So I have a stored procedure called Spr_EventLogCreate defined in my database. I have created a function import in my data model called LogEvent with no return type, and I can see this function in the Model Browser tree at MyModel.edmx > MyModel > EntityContainer > Function Imports > LogEvent. I thought I should then be able to call the function in my code as follows: var context = new MyModelEntities(); context.LogEvent(...); But the LogEvent() method is not present. I must be being really stupid here, but how do I call my imported function? Using VS 2008 and EF 3.5.
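
    The fallback I am trying to avoid is bypassing the model and calling the stored procedure over plain ADO.NET, roughly like this (the connection string name and the @Message parameter are guesses on my part):

        using System.Configuration;
        using System.Data;
        using System.Data.SqlClient;

        class EventLogger
        {
            public static void LogEvent(string message)
            {
                // Plain ADO.NET call to the same stored procedure, outside the EF model.
                string cs = ConfigurationManager.ConnectionStrings["MyModelDb"].ConnectionString;
                using (var conn = new SqlConnection(cs))
                using (var cmd = new SqlCommand("Spr_EventLogCreate", conn))
                {
                    cmd.CommandType = CommandType.StoredProcedure;
                    cmd.Parameters.AddWithValue("@Message", message);
                    conn.Open();
                    cmd.ExecuteNonQuery();
                }
            }
        }

    But surely there is a way to do this through the generated context?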

    Read the article

  • Notification between J2EE components.

    - by Pratik
    Hi there! I have a design problem. My application has multiple J2EE components; in simple terms, one acts as a service provider (non-UI) and the others are consumers (UI web apps). The consumers get configuration data from the service provider (which basically reads the data from the DB) during startup and store it in a cache. The cache is refreshed periodically to reflect any changes made to the database. The problem: apart from the periodic cache refresh, I also want to notify the consumers when someone changes the DB, i.e. "the configuration has been changed, please reload it". What notification mechanisms can I use to achieve this? Thanks! Pratik

    Read the article

  • SSAS 2008 backup/restore fails with a GetOverlappedResult 'Insufficient system resources exist to complete the requested service' error

    - by Anant Aneja
    Hi, on my SSAS 2008 instance, if a backup/restore of any database is made to/from a UNC path I get an error: The following system error occurred from a call to GetOverlappedResult for Physical file: '\\server\share\OLAPDB.abf', Logical file: '' : Insufficient system resources exist to complete the requested service. Server: The operation has been cancelled. (Microsoft.AnalysisServices) Creating/copying/moving a file of any size on the share using Explorer or the command prompt works. The most useful link I could find is: http://www.tech-archive.net/Archive/Development/microsoft.public.win32.programmer.kernel/2004-07/0475.html Can anyone shed more light on what could be causing this error? (I've posted the same question on the SSAS forums - just a heads up)

    Read the article

  • Maintaining a set of local commits while working with git-svn

    - by benizi
    I am using git to develop against a project hosted in Subversion, using git-svn: git svn clone svn://project/ My general workflow has been to repeatedly edit-and-commit on the master branch, then commit to the svn repository via: git stash git svn dcommit git stash apply One of the local modifications that the 'stash' command is preserving, which I don't want to commit to the svn repository, is a changed database connection string. What's the most convenient way to keep this local change without the extra 'stash' steps? I suspect that something like 'stash' or 'quilt' is what I'm looking for, but I'm still new enough to git that I think I'm missing some terminology that would lead to the exact incantation. Update: The only solution I found that avoids the git stash + git-svn action + git stash apply series was to update the git-svn ref manually: (check in the local-only change to 'master', then...) $ cat .git/refs/heads/master > .git/refs/remotes/git-svn $ git svn fetch (with at least one new SVN revision) And that leaves the local-only commit as a weird (probably unsafe) commit between two svn revisions.

    Read the article

  • Core Data performance: deleteObject and saving the managed object context

    - by Gary
    I am trying to figure out the best way to bulk delete objects inside my Core Data database. I have some objects with a parent/child relationship. At times I need to "refresh" the parent object by clearing out all of the existing child objects and adding new ones to Core Data. The 'delete all' portion of this operation is where I am running into trouble. I accomplish this by looping through the children and calling deleteObject for each one. I have noticed that the NSManagedObjectContext save call following all of the deleteObject calls is very slow when I am deleting 15,000 objects. How can I speed up this call? Are there things happening during the save operation that I can be aware of and avoid by setting parameters differently or setting up my model another way? I've noticed that memory spikes during this operation as well. I really just want a "delete * from". Thanks.

    Read the article

  • TIBCO ActiveDatabase Error

    - by George
    Folks, I am getting the following error when trying to start an instance of the ActiveDatabase publisher. I'm using Designer 5.7.2 and trying to access a SQL Server 2008 database. Has anyone seen this error before? I can't find any reference on Google or other sites! Adaptador_M2M.Adaptador_M2M Error [Adapter] AEADB-910005 Startup Error. SDK Exception Code = AESDKC-0087, Category = Metadata, Severity = errorRole, Description = Class description not available for: ADB_PREREGLISTENER, File = C:/suren/workspace/Maverick/maverick-5.6.1-dev/libmaverick/MInstanceImpl.cpp, line = 71 received on starting the adapter after initialization. The Repository URL is D:\TEMP\AT_adadb_61214.dat and the Configuration URL is Corporativo/IntegracaoM2M/Adaptadores/Adaptador_M2M.

    Read the article

  • Differences between HttpComponents, Restlet, Apache MINA and Netty

    - by dexter
    I used HttpComponents to implement a custom web server that accesses a SQLite database. Requests are sent via TCP/IP and I am using REST concepts. By the way, my frontend is HTML/jQuery. I know it would be a lot easier if I just created a servlet, but I am restricted to using only the Apache HTTP server. I really don't get good performance using HttpComponents. Any suggestions, please? Thanks in advance.

    Read the article

  • Data refresh and drill-down problem with SSAS cube and Excel Services

    - by chaitanya
    I have an SSAS cube which I am using in an Excel document to prepare a report with drill-down etc., and I am publishing it to a SharePoint site. It gets published all right, but when I try to drill down it throws a "Data Refresh failed" error. The data source and the SharePoint site are on the same machine (running Windows Server 2008) and we have Windows authentication running. From what I have been able to find on the internet, there is a problem with passing the Windows authentication credentials through to the database, but I have not been able to find exactly how to sort out this problem. What is the solution for this?

    Read the article

  • Technique for ensuring HTML- and URL-encoding

    - by JW
    Has anyone implemented a good template system for ensuring that output is properly HTML-encoded where it makes sense? Maybe even something that recognizes when output should be URL-encoded or JSON-encoded instead? The lazy approach — just encoding all inputs — causes problems when you want to send those inputs to a database, or to a block of JavaScript code. So something a little smarter is needed. The tedious approach — putting the proper encoding function around each piece of data on the template — works, but it's easy for developers to forget to do it. Is there a good approach that makes it easy for developers, and ensures that the right encoding is done? I was listening to one of the SO podcasts, and Joel tossed out an idea about using typed data to enforce a difference between HTML-encoded strings and non-encoded strings. Maybe that could be a starting point. I'm looking more for a strategy than for an implementation in a particular language (although I'd be happy to hear about implementations that already exist and work).
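
    The typed-data idea could look something like this C# sketch (just an illustration of the strategy, not a finished library): a wrapper type that can only be obtained by encoding, so templates demand HtmlText instead of raw strings and the compiler enforces the encoding step.

        using System.Web;   // HttpUtility; requires a reference to System.Web

        // A value guaranteed to be HTML-encoded: the only way to build one from a
        // raw string is through Encode(), so forgetting to encode becomes a type error.
        public sealed class HtmlText
        {
            private readonly string encoded;

            private HtmlText(string encoded) { this.encoded = encoded; }

            public static HtmlText Encode(string raw)
            {
                return new HtmlText(HttpUtility.HtmlEncode(raw));
            }

            // Markup that is already safe (e.g. produced by another template)
            // has to be marked as such explicitly.
            public static HtmlText Raw(string trustedMarkup)
            {
                return new HtmlText(trustedMarkup);
            }

            public override string ToString() { return encoded; }
        }

    The same pattern would extend to UrlText or JsonText wrappers, so a value encoded for one context can't be pasted into another by accident. I'm still interested in strategies that don't rely on the type system, though.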

    Read the article

  • Slow SQL Sync with Microsoft Sync Framework on Mobile Client

    - by Malkier
    Hello, we are developing an application which uses the Microsoft Sync Framework to sync data between Windows CE 6.0 clients with SQL CE 3.5 SP1 and a SQL Server 2008 database. Our major problem is a slow sync time of up to 1 minute for 15 tables which are totally empty. Here's a breakdown of our components. Server: SQL Server 2008; 15 tables with change tracking enabled; a WCF service with an endpoint for the mobile sync (uses Sync Framework 2.0). Client (mobile): Windows CE 6.0; a .NET application using Sync Framework for Devices (CTP 1), which starts the sync. As I mentioned above, the sync takes up to 1 minute with no changes and empty tables. The mobile device is in its dock. This is a deal breaker for a production environment. Does anybody have any experience in this field? Is there a way to improve things? Thanks for any responses.

    Read the article

  • Dumping views with mysqldump in the right order.

    - by Bushibytes
    I have a script that backs up our database, which contains multiple tables and views constructed from those tables. The command used is: mysqldump -u UserName -ppassword -h hostname DatabaseName > dump.sql I have noticed, however, that some view definitions are backed up before the definitions of the tables they depend on. This causes an issue when restoring with the classic mysql -u UserName -p < dump.sql, as when it tries to create the view, the table it needs does not exist yet. It is possible to edit the dump files before restoring, but I was wondering: is there a way to make mysqldump back up the tables and views in the right order? Or is there a way to restore from a dump that will create the right tables first (or create sane temporary tables)? Edit for version: mysqldump Ver 10.11 Distrib 5.0.51b, for redhat-linux-gnu (x86_64)

    Read the article

  • OWC does not work with IE8

    - by mactov
    Hi, I have a web page that is generated with Access 2003 and uses Office Web Components. It worked fine with IE6 and IE7 but no longer works with IE8. Here are more details: I create an MSODSC component and a WSH object to get my ConnectionString from the registry. The drop-down lists are then fed by a query to the database. It works perfectly with IE6 and IE7, and it works locally with IE8, but if the page is served by IIS to IE8, the drop-down lists are empty. Can anyone help me? Thanks, Mactov

    Read the article

  • Do MSDTC and disaster recovery go together?

    - by DevDelivery
    Our application writes to multiple SQL Server databases within a distributed transaction. The ops guys are saying that this messes up their disaster recovery plan, because while the transactions on the live tables may commit at the same time, the log shipping on the separate databases happens at slightly different times. So in a disaster recovery situation there will be a few partial transactions. Is there a method for maintaining separate but synced databases in DR? Or do we have to redesign around relatively independent databases (or a single database)?

    Read the article

  • Configuring ASP.NET web applications: best practices

    - by Andrew Florko
    Hello everybody, there is a lot of configurable information for a web site: UI messages; the number of records used in pagination and other UI parameters; cache durations for web pages and timeouts; route maps and site structure; and so on. There are also many places to store all this information: AppSettings (web.config); custom sections (web.config); external XML/text files referenced from web.config; internal static class(es) of constants; database table(s); etc. Which approaches do you usually choose for your tasks, and which do you find unsuitable? Thank you in advance!
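
    For the "custom sections" option, for example, a minimal sketch of what I have been experimenting with (the section and property names are purely illustrative) looks like this:

        using System.Configuration;

        // Strongly typed access to something like
        //   <siteSettings pageSize="25" cacheSeconds="300" />
        // registered under <configSections> in web.config.
        public class SiteSettings : ConfigurationSection
        {
            [ConfigurationProperty("pageSize", DefaultValue = 25)]
            public int PageSize
            {
                get { return (int)this["pageSize"]; }
            }

            [ConfigurationProperty("cacheSeconds", DefaultValue = 300)]
            public int CacheSeconds
            {
                get { return (int)this["cacheSeconds"]; }
            }
        }

        // Usage, e.g. in a page or service:
        //   var settings = (SiteSettings)ConfigurationManager.GetSection("siteSettings");
        //   int pageSize = settings.PageSize;

    It keeps the values typed and out of the code, but I'm not sure it scales to things like route maps or UI messages, hence the question.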

    Read the article
