Search Results

Search found 7270 results on 291 pages for 'sqlite extension'.

Page 176/291 | < Previous Page | 172 173 174 175 176 177 178 179 180 181 182 183  | Next Page >

  • Which full-text search package should I use for SQLite3?

    - by Benjamin Pollack
    SQLite3 appears to come with three different full-text search engines, called FTS1, FTS2, and FTS3. The documentation on the website mentions that FTS1 is stable, FTS2 is in development, and that you should use FTS2. Examples I find online use FTS3, which is in CVS and, unlike FTS2, not documented. None of the full-text search engines come with the amalgamated source, as near as I can tell. So, my question: which of these three engines, if any, should I use for full-text indexing in SQLite? Or should I simply use a third-party tool like Sphinx, or a custom solution in Lucene, instead?
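    Whichever FTS module ends up compiled into a given SQLite build, the query surface is the same: a virtual table plus MATCH. A minimal sketch in Python, assuming the standard sqlite3 module was built with FTS enabled (table and column names here are invented):

        import sqlite3

        conn = sqlite3.connect(":memory:")
        # Fails with "no such module: fts3" if FTS was not compiled in.
        conn.execute("CREATE VIRTUAL TABLE docs USING fts3(title, body)")
        conn.execute("INSERT INTO docs(title, body) VALUES (?, ?)",
                     ("greeting", "full-text search inside sqlite"))
        hits = conn.execute("SELECT title FROM docs WHERE body MATCH 'search'").fetchall()
        print(hits)  # [('greeting',)]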

    Read the article

  • How should I manage data in a 2D vector-based animation program?

    - by shadow
    I've been trying to design a program that makes 2D animations and then uses the ffmpeg library to create the video, for possible use in TV and movies. The problem is that when I think about how to manage the data in the application, I can only come up with two ways, and I don't think either will work out very well. One is to use an SQLite database, but it seems like saving will be difficult, especially if an artist puts 1000 things on screen. The other is to use something like linked lists, which would duplicate many features of the database and get complicated when dealing with things like points on a bezier curve, or jumping to a frame and collecting all the objects that need to be drawn on that frame. Should I use one of these solutions, or is there something else that would be better? I'm currently planning to use C# for the code.
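    One way to split the difference, sketched below in Python purely for brevity (the poster plans C#, and every name and column here is invented): keep the scene as plain in-memory objects while editing, and only flatten them into SQLite rows when the user saves.

        import sqlite3
        from dataclasses import dataclass

        @dataclass
        class SceneObject:
            frame: int
            kind: str      # e.g. "bezier", "image"
            data: str      # serialized points/properties

        def save_scene(objects, path):
            # Rewrite the whole scene on save; editing never touches the database.
            conn = sqlite3.connect(path)
            conn.execute("CREATE TABLE IF NOT EXISTS objects (frame INTEGER, kind TEXT, data TEXT)")
            conn.execute("DELETE FROM objects")
            conn.executemany("INSERT INTO objects VALUES (?, ?, ?)",
                             [(o.frame, o.kind, o.data) for o in objects])
            conn.commit()
            conn.close()

        scene = [SceneObject(1, "bezier", "[(0,0),(10,5)]"), SceneObject(1, "image", "bg.png")]
        save_scene(scene, "animation.db")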

    Read the article

  • Large amounts of data in many text files - how to process?

    - by Stephen
    Hi, I have a large amount of data (a few terabytes) and it keeps accumulating. It is contained in many tab-delimited flat text files (each about 30MB). Most of the task involves reading the data and aggregating (summing/averaging plus additional transformations) over observations/rows based on a series of predicate statements, then saving the output as text, HDF5, or SQLite files, etc. I normally use R for such tasks, but I fear this may be a bit too large for it. Some candidate solutions are to 1) write the whole thing in C (or Fortran); 2) import the files (tables) directly into a relational database and then pull off chunks in R or Python (some of the transformations are not amenable to pure SQL solutions); or 3) write the whole thing in Python. Would (3) be a bad idea? I know you can wrap C routines in Python, but in this case, since there isn't anything computationally prohibitive (e.g., optimization routines that require many iterative calculations), I think I/O may be as much of a bottleneck as the computation itself. Do you have any recommendations on further considerations or suggestions? Thanks
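    For scale, a streaming pass in Python can aggregate files of this shape without ever holding one in memory. The sketch below assumes a made-up layout (a key column followed by a numeric value, files under data/) and a simple running-total aggregation; the upsert syntax also assumes SQLite 3.24 or newer.

        import csv
        import glob
        import sqlite3

        conn = sqlite3.connect("aggregates.db")
        conn.execute("CREATE TABLE IF NOT EXISTS totals (key TEXT PRIMARY KEY, total REAL, n INTEGER)")

        for path in glob.glob("data/*.txt"):
            with open(path, newline="") as fh:
                for row in csv.reader(fh, delimiter="\t"):
                    key, value = row[0], float(row[1])   # assumed column layout
                    conn.execute(
                        "INSERT INTO totals VALUES (?, ?, 1) "
                        "ON CONFLICT(key) DO UPDATE SET "
                        "total = total + excluded.total, n = n + 1",
                        (key, value),
                    )
        conn.commit()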

    Read the article

  • Problem with between join for sqlalchemy orm relation.

    - by Gary van der Merwe
    I'm trying to create a relation that has a BETWEEN join. Here is a shortish example of what I'm trying to do:

        #!/usr/bin/env python
        import sqlalchemy as sa
        from sqlalchemy import orm
        from sqlalchemy.engine.base import Engine
        from sqlalchemy.ext.declarative import declarative_base

        metadata = sa.MetaData()
        Base = declarative_base(metadata=metadata)
        engine = sa.create_engine('sqlite:///:memory:')

        class Network(Base):
            __tablename__ = "network"
            id = sa.Column(sa.Integer, primary_key=True)
            ip_net_addr_db = sa.Column('ip_net_addr', sa.Integer, index=True)
            ip_broadcast_addr_db = sa.Column('ip_broadcast_addr', sa.Integer, index=True)
            # This can be determined from the net address and the net mask, but we store
            # it in the db so that we can join with the address table.
            ip_net_mask_len = sa.Column(sa.SmallInteger)

        class Address(Base):
            __tablename__ = "address"
            ip_addr_db = sa.Column('ip_addr', sa.Integer, primary_key=True, index=True, unique=True)

        Network.addresses = orm.relation(
            Address,
            primaryjoin=Address.ip_addr_db.between(
                Network.ip_net_addr_db, Network.ip_broadcast_addr_db),
            foreign_keys=[Address.ip_addr_db])

        metadata.create_all(engine)
        Session = orm.sessionmaker(bind=engine)
        Network()

    If you run this, you get this error:

        ArgumentError: Could not determine relation direction for primaryjoin condition
        'address.ip_addr BETWEEN network.ip_net_addr AND network.ip_broadcast_addr',
        on relation Network.addresses. Do the columns in 'foreign_keys' represent only
        the 'foreign' columns in this join condition ?

    The answer to that question is yes, but I can't figure out how to tell it that.
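    A hedged sketch, not from the original post, of how this kind of range join is usually expressed in more recent SQLAlchemy releases (it reuses the Network and Address classes above): annotate the column that should act as the foreign side with foreign() and mark the relation viewonly, since SQLAlchemy cannot persist changes through a BETWEEN predicate.

        from sqlalchemy.orm import relationship, foreign

        Network.addresses = relationship(
            Address,
            primaryjoin=foreign(Address.ip_addr_db).between(
                Network.ip_net_addr_db, Network.ip_broadcast_addr_db),
            viewonly=True,   # the BETWEEN join can be queried, not written through
        )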

    Read the article

  • Core Data and multiple entities

    - by mr.octobor
    I'm a newbie and I need to save the user's "Ranking" and "Level". I created the file Ranking.xcdatamodel to save "Ranking", with an entity named Ranking (properties: Rank, Name), and I can save and show it. But when I add an entity Level (property: CurrentLevel), my program crashes and shows this message:

        Unresolved error Error Domain=NSCocoaErrorDomain Code=134100 UserInfo=0x60044b0
        "Operation could not be completed. (Cocoa error 134100.)", {
            metadata = {
                NSPersistenceFrameworkVersion = 248;
                NSStoreModelVersionHashes = { Users = ; };
                NSStoreModelVersionHashesVersion = 3;
                NSStoreModelVersionIdentifiers = ( );
                NSStoreType = SQLite;
                NSStoreUUID = "41225AD0-B508-4AA7-A5E2-15D6990FF5E7";
                "_NSAutoVacuumLevel" = 2;
            };
            reason = "The model used to open the store is incompatible with the one used to create the store";
        }

    I don't know how to save "Level"; please advise.

    Read the article

  • FMDB transaction

    - by user142764
    Hi! I use FMDB to wrap SQLite in my app. I haven't found any docs about the use of the methods begin, beginUpdates, commit, finalize, etc., and I face some problems in my app which I think are caused by the way I use transactions. Here is what I tried:

        [FMDB beginUpdates]
        - my insert statement -
        [FMDB commit]
        [FMDB finalize]

    It crashes with this log:

        Terminating app due to uncaught exception 'NSInvalidArgumentException', reason:
        '*** -[FMDatabase<0xd705a0 finalize]: called when collecting not enabled'

    Could you please give me an example of how you are using transactions, or point me to a doc? Thanks in advance, Vincent.

    Read the article

  • Select from multiple tables, remove duplicates

    - by staze
    I have two tables in a SQLite DB, and both have the following fields: idnumber, firstname, middlename, lastname, email, login One table has all of these populated, the other doesn't have the idnumber, or middle name populated. I'd LIKE to be able to do something like: select idnumber, firstname, middlename, lastname, email, login from users1,users2 group by login; But I get an "ambiguous" error. Doing something like: select idnumber, firstname, middlename, lastname, email, login from users1 union select idnumber, firstname, middlename, lastname, email, login from users2; LOOKS like it works, but I see duplicates. my understanding is that union shouldn't allow duplicates, but maybe they're not real duplicates since the second user table doesn't have all the fields populated (e.g. "20, bob, alan, smith, [email protected], bob" is not the same as "NULL, bob, NULL, smith, [email protected], bob"). Any ideas? What am I missing? All I want to do is dedupe based on "login". Thanks!
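    The UNION behaviour can be reproduced in a few lines; the snippet below uses made-up table contents (including a placeholder e-mail address) just to show that rows which carry NULLs where the other row has values are not duplicates as far as UNION is concerned.

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.executescript("""
            CREATE TABLE users1 (idnumber, firstname, middlename, lastname, email, login);
            CREATE TABLE users2 (idnumber, firstname, middlename, lastname, email, login);
            INSERT INTO users1 VALUES (20, 'bob', 'alan', 'smith', 'bob@example.org', 'bob');
            INSERT INTO users2 VALUES (NULL, 'bob', NULL, 'smith', 'bob@example.org', 'bob');
        """)
        rows = conn.execute("""
            SELECT idnumber, firstname, middlename, lastname, email, login FROM users1
            UNION
            SELECT idnumber, firstname, middlename, lastname, email, login FROM users2
        """).fetchall()
        print(len(rows))  # 2 -- both rows survive because their column values differ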

    Read the article

  • How to keep the user's login session?

    - by YaW
    Hi, I have an app that requires the user to register. I've got the app connected to PHP on my server to handle the register/login logic, so that part is not a problem at all. But I want to keep a local session for the user, so the user doesn't have to log in again every time he runs the app. So I want something like this: a first-time user opens the app, registers and logs in, does some stuff with the app, and closes it; when he opens the app again, he is recognized and doesn't need to log in again. I only need to store an ID and a username (both fetched from the DB in the login PHP method). What's the best way to do this? I've thought about using a custom preference, storing the data in files, or even a local DB (SQLite). Any suggestions?

    Read the article

  • Does the Fetch Request do the recommended batch faulting?

    - by dontWatchMyProfile
    I'm curious. Apple says in the docs: "Core Data automatically fires faults when necessary (when a persistent property of a fault is accessed). However, firing faults individually can be inefficient, and there are better strategies for getting data from the persistent store (see 'Batch Faulting and Pre-fetching with the SQLite Store')." NSFetchRequest has this feature: [fetchRequest setFetchBatchSize:20]; Is this essentially performing the kind of batch faulting that is recommended? Just to make this clear for others: "firing a fault" does not mean "turning the object into a fault" but rather "materializing it", just like "making a Scooby-Doo out of it". Pretty ugly wording, in my opinion, but it's at least consistent in the docs ;)

    Read the article

  • Add Core Data Index to certain Attributes via migration

    - by steipete
    For performance reasons, I want to set the Indexed attribute on some of my entities' attributes. I created a new Core Data model version to perform the changes. Core Data detects the changes and migrates my model to the new version; however, NO INDEXES ARE GENERATED. If I recreate the database from scratch, the indexes are there. I checked with SQLite Browser both on the iPhone and in the Simulator. The problem only occurs if a database in the prior format is already there. Is there a way to manually add the indexes? Write some SQL for that? Or am I missing something? I've already done some more critical migrations with no problems there, but those missing indexes are bugging me. Thanks for helping!

    Read the article

  • Fluent NHib causing Visual Studio 2010 to hang at runtime

    - by Berryl
    I just installed and migrated a 2008 solution on Vista Ultimate 64 and .NET 4.0. Everything builds and the tests run surprisingly well, but I got the hang described below while trying to run the app under SQLite. It turns out that the hang happens when the call is made for FNH to build the session factory during a run; the only feedback I get is that the database wasn't configured properly, without any inner exception. The strange part is that the exact same code works perfectly under tests. Any clues?

        Description: A problem caused this program to stop interacting with Windows.
        Problem signature:
          Problem Event Name: AppHangB1
          Application Name: devenv.exe
          Application Version: 10.0.30319.1
          Application Timestamp: 4ba1fab3
          Hang Signature: b9ed
          Hang Type: 6152
          OS Version: 6.0.6002.2.2.0.256.1
          Locale ID: 1033
          Additional Hang Signature 1: 005de38e6b4bb3afd8e147932c6431cc
          Additional Hang Signature 2: d54c
          Additional Hang Signature 3: 05f671c8289bf8dd31e6ccfe265baa77
          Additional Hang Signature 4: 784c
          Additional Hang Signature 5: c8207f54dadf3eb38dfcf1ae152f4229
          Additional Hang Signature 6: ff83
          Additional Hang Signature 7: 220932152f3f04fffb6ca3abf15e6dc6

    Read the article

  • Launching Vim via Lua

    - by Keith Pimmel
    I'm writing a simple little Lua command-line app that will build a static website. I'm storing my fragments in an SQLite database. Retrieving the data from the db is straightforward, as is saving it; my question concerns editing the data. Is there an elegant way to pipe the data from Lua to Vim? Can Vim edit a memory buffer and return it? I was planning on launching the editor via os.execute('vim'), but only after grabbing a temporary file handle and dumping the database output into it. I would like to avoid touching the filesystem that way, but that is my contingency plan.

    Read the article

  • How to add data manually to a Core Data entity

    - by pankaj
    Hi, I am working with Core Data for the first time. I have just created an entity and attributes for that entity. I want to add some data to the entity (you could say I want to add data to a table); earlier, when I was using SQLite, I would add data using the terminal. But here in Core Data I am not able to find a place where I can manually add data. I just want to add data to the entity and display it in a UITableView. I have gone through the Core Data documentation, but it does not explain how to add data manually, although it explains how to add it programmatically. I don't need to do it programmatically; I want to do it manually. Thanks in advance

    Read the article

  • Iterator blocks in Clojure?

    - by Checkers
    I am using clojure.contrib.sql to fetch some records from an SQLite database.

        (defn read-all-foo []
          (with-connection *db*
            (with-query-results res ["select * from foo"]
              (into [] res))))

    Now, I don't really want to realize the whole sequence before returning from the function (i.e. I want to keep it lazy), but if I return res directly or wrap it in some kind of lazy wrapper (for example, I want to apply a certain map transformation to the result sequence), the SQL-related bindings will be reset and the connection will be closed after I return, so realizing the sequence will throw an exception. How can I enclose the whole function in a closure and return a kind of iterator block (like yield in C# or Python)? Or is there another way to return a lazy sequence from this function?

    Read the article

  • Android Activities UI Persistence

    - by aandroid
    I need to have two activities in an Android app that can be switched between each other with UI persistence as follows: Activity A launches Activity B. User triggers some UI changes in Activity B. Activity B returns to Activity A (by a call to onBackPressed() or something similar) Activity A re-launches Activity B. I would like the changes made in step 2 to be visible in step 4. I have tried using the singleInstance activity tag on Activity B to no avail. I would also prefer a more elegant solution than simply writing all object properties to a file or SQLite table. It seems that this behaviour must be easily achievable given that Android does it automatically for calls to onBackPressed() where the parent Activity's UI is saved. Any help is much appreciated.

    Read the article

  • How to install Nokogiri as a MacRuby gem?

    - by Jakub Hampl
    The latest MacRuby release notes (v0.6) state that the authors have managed to get this release working with the SQLite and Nokogiri gems. However, when I run sudo macgem install nokogiri I get the following errors:

        ERROR: Error installing nokogiri: extconf failed:

    and then a bunch of paths followed by:

        libxml2 is missing. try 'port install libxml2' or 'yum install libxml2'
        /Library/Frameworks/MacRuby.framework/Versions/0.6/usr/lib/ruby/Gems/1.9.0/gems/nokogiri-1.4.1/ext/nokogiri/extconf.rb:1:in `<main>': libxml2 is missing. try 'port install libxml2' or 'yum install libxml2' (SystemExit)

    Does anyone know how to get this working? My platform is Mac OS X 10.6.3. Nokogiri normally (meaning on plain old Ruby 1.8.7) installs without a problem.

    Read the article

  • Importing large datasets on iPhone using CoreData

    - by Matthes
    Hi there, I'm facing a very annoying problem. My iPhone app loads its data from a network server. The data are sent as a plist and, once parsed, need to be stored in an SQLite db using Core Data. The issue is that in some cases those datasets are too big (5000+ records) and the import takes way too long. On top of that, when the iPhone tries to suspend the screen, the watchdog kills the app because it's still processing the import and doesn't respond for up to 5 seconds, so the import is never finished. I used all the recommended techniques from the article "Efficiently Importing Data" http://developer.apple.com/mac/library/DOCUMENTATION/Cocoa/Conceptual/CoreData/Articles/cdImporting.html and other docs on the subject, but it's still awfully slow. The solution I'm looking for is to let the app suspend but let the import run in the background (the better option), or to prevent attempts to suspend the app at all. Any better idea is welcome too. Any tips on how to overcome these issues are highly appreciated! Thanks

    Read the article

  • SQLce Select query problem

    - by DieHard
    I wrote a truck show contest voting app (voting, financials, etc.) using SQLite, and decided to write a backup app for show day using SQL CE 3.5. I created the db, moved it to the data directory, created tables, and configured the DataGridViews; all is well. I entered some test data, started Management Studio 2008, ran a select query against the table, and got null returns. I started the app from Visual Studio and found that the test data is gone. I re-entered the data, ran the query in Management Studio, and the data was gone again. If I use Visual Studio I can start the app and enter data, close the app, restart it, and the data is still there; the data seems to disappear only when an outside tool runs a select query against it. I don't know CE that well, but this cannot be right. select * from votes = delete * from votes?

    Read the article

  • Exception - Illegal Block size during decryption (Android)

    - by Vamsi
    I am writing an application which encrypts and decrypts the user's notes based on a password the user sets. I used the following algorithms for encryption/decryption:

        1. PBEWithSHA256And256BitAES-CBC-BC
        2. PBEWithMD5And128BitAES-CBC-OpenSSL

        e_Cipher = Cipher.getInstance(PBEWithSHA256And256BitAES-CBC-BC);
        d_Cipher = Cipher.getInstance(PBEWithSHA256And256BitAES-CBC-BC);
        e_Cipher.init()
        d_Cipher.init()

    Encryption works well, but when trying to decrypt it gives Exception - Illegal Block size. After encryption I convert the cipherText to HEX and store it in an SQLite database. I retrieve the correct values from the SQLite database during decryption, but when calling d_Cipher.doFinal() it throws the exception. I thought I had missed specifying the padding and tried to check what other cipher algorithms are available, but I was unable to find out, so I request you to please give me some knowledge on what cipher algorithms and padding are supported by Android. If the algorithm I used can be used with padding, how should I specify the padding mechanism? I am pretty new to encryption, so I tried a couple of algorithms which are available in BouncyCastle.java, but was unsuccessful. As requested, here is the code:

        public class CryptoHelper {
            private static final String TAG = "CryptoHelper";
            //private static final String PBEWithSHA256And256BitAES = "PBEWithSHA256And256BitAES-CBC-BC";
            //private static final String PBEWithSHA256And256BitAES = "PBEWithMD5And128BitAES-CBC-OpenSSL";
            private static final String PBEWithSHA256And256BitAES = "PBEWithMD5And128BitAES-CBC-OpenSSLPBEWITHSHA1AND3-KEYTRIPLEDES-CB";
            private static final String randomAlgorithm = "SHA1PRNG";
            public static final int SALT_LENGTH = 8;
            public static final int SALT_GEN_ITER_COUNT = 20;
            private final static String HEX = "0123456789ABCDEF";
            private Cipher e_Cipher;
            private Cipher d_Cipher;
            private SecretKey secretKey;
            private byte salt[];

            public CryptoHelper(String password) throws InvalidKeyException, NoSuchAlgorithmException,
                    NoSuchPaddingException, InvalidAlgorithmParameterException, InvalidKeySpecException {
                char[] cPassword = password.toCharArray();
                PBEKeySpec pbeKeySpec = new PBEKeySpec(cPassword);
                PBEParameterSpec pbeParamSpec = new PBEParameterSpec(salt, SALT_GEN_ITER_COUNT);
                SecretKeyFactory keyFac = SecretKeyFactory.getInstance(PBEWithSHA256And256BitAES);
                secretKey = keyFac.generateSecret(pbeKeySpec);
                SecureRandom saltGen = SecureRandom.getInstance(randomAlgorithm);
                this.salt = new byte[SALT_LENGTH];
                saltGen.nextBytes(this.salt);
                e_Cipher = Cipher.getInstance(PBEWithSHA256And256BitAES);
                d_Cipher = Cipher.getInstance(PBEWithSHA256And256BitAES);
                e_Cipher.init(Cipher.ENCRYPT_MODE, secretKey, pbeParamSpec);
                d_Cipher.init(Cipher.DECRYPT_MODE, secretKey, pbeParamSpec);
            }

            public String encrypt(String cleartext) throws IllegalBlockSizeException, BadPaddingException {
                byte[] encrypted = e_Cipher.doFinal(cleartext.getBytes());
                return convertByteArrayToHex(encrypted);
            }

            public String decrypt(String cipherString) throws IllegalBlockSizeException {
                byte[] plainText = decrypt(convertStringtobyte(cipherString));
                return (new String(plainText));
            }

            public byte[] decrypt(byte[] ciphertext) throws IllegalBlockSizeException {
                byte[] retVal = {(byte) 0x00};
                try {
                    retVal = d_Cipher.doFinal(ciphertext);
                } catch (BadPaddingException e) {
                    Log.e(TAG, e.toString());
                }
                return retVal;
            }

            public String convertByteArrayToHex(byte[] buf) {
                if (buf == null) return "";
                StringBuffer result = new StringBuffer(2 * buf.length);
                for (int i = 0; i < buf.length; i++) {
                    appendHex(result, buf[i]);
                }
                return result.toString();
            }

            private static void appendHex(StringBuffer sb, byte b) {
                sb.append(HEX.charAt((b >> 4) & 0x0f)).append(HEX.charAt(b & 0x0f));
            }

            private static byte[] convertStringtobyte(String hexString) {
                int len = hexString.length() / 2;
                byte[] result = new byte[len];
                for (int i = 0; i < len; i++) {
                    result[i] = Integer.valueOf(hexString.substring(2 * i, 2 * i + 2), 16).byteValue();
                }
                return result;
            }

            public byte[] getSalt() {
                return salt;
            }

            public SecretKey getSecretKey() {
                return secretKey;
            }

            public static SecretKey createSecretKey(char[] password) throws NoSuchAlgorithmException, InvalidKeySpecException {
                PBEKeySpec pbeKeySpec = new PBEKeySpec(password);
                SecretKeyFactory keyFac = SecretKeyFactory.getInstance(PBEWithSHA256And256BitAES);
                return keyFac.generateSecret(pbeKeySpec);
            }
        }

    I call mCryptoHelper.decrypt(String str) and it results in the Illegal block size exception. My environment: Android 1.6 on Eclipse.

    Read the article

  • SQLAlchemy automatically converts str to unicode on commit

    - by Victor Stanciu
    Hello, when inserting an object into a database with SQLAlchemy, all of its properties that correspond to String() columns are automatically transformed from <type 'str'> to <type 'unicode'>. Is there a way to prevent this behavior? Here is the code:

        from sqlalchemy import create_engine, Table, Column, Integer, String, MetaData
        from sqlalchemy.orm import mapper, sessionmaker

        engine = create_engine('sqlite:///:memory:', echo=False)
        metadata = MetaData()
        table = Table('projects', metadata,
            Column('id', Integer, primary_key=True),
            Column('name', String(50))
        )

        class Project(object):
            def __init__(self, name):
                self.name = name

        mapper(Project, table)
        metadata.create_all(engine)
        session = sessionmaker(bind=engine)()

        project = Project("Lorem ipsum")
        print(type(project.name))
        session.add(project)
        session.commit()
        print(type(project.name))

    And here is the output:

        <type 'str'>
        <type 'unicode'>

    I know I should probably just work with unicode, but this would involve digging through some third-party code and I don't have the Python skills for that yet :)

    Read the article

  • How to fetch data for AutoCompleteTextView in separate thread?

    - by Laimoncijus
    For my AutoCompleteTextView I need to fetch the data from a web service. As this can take a little time, I do not want the UI thread to become unresponsive, so I need to fetch the data in a separate thread somehow. For example, when fetching data from an SQLite DB this is very easily done with the CursorAdapter method runQueryOnBackgroundThread. I looked around at other adapters like ArrayAdapter and BaseAdapter but could not find anything similar... Is there an easy way to achieve this? I cannot simply use ArrayAdapter directly, as the suggestions list is dynamic - I always fetch the suggestions depending on the user input, so the list cannot be pre-fetched and cached for further use... If someone could give some tips or examples on this topic, that would be great!

    Read the article

  • Why is django admin not accepting Nullable foreign keys?

    - by p.g.l.hall
    Here is a simplified version of one of my models:

        class ImportRule(models.Model):
            feed = models.ForeignKey(Feed)
            name = models.CharField(max_length=255)
            feed_provider_category = models.ForeignKey(FeedProviderCategory, null=True)
            target_subcategories = models.ManyToManyField(Subcategory)

    This class manages a rule for importing a list of items from a feed into the database. The admin system won't let me add an ImportRule without selecting a feed_provider_category, despite it being declared in the model as nullable. The database (SQLite at the moment) even checks out ok:

        >>> .schema
        ...
        CREATE TABLE "someapp_importrule" (
            "id" integer NOT NULL PRIMARY KEY,
            "feed_id" integer NOT NULL REFERENCES "someapp_feed" ("id"),
            "name" varchar(255) NOT NULL,
            "feed_provider_category_id" integer REFERENCES "someapp_feedprovidercategory" ("id"),
        );
        ...

    I can create the object in the python shell easily enough:

        f = Feed.objects.get(pk=1)
        i = ImportRule(name='test', feed=f)
        i.save()

    ...but the admin system won't let me edit it, of course. How can I get the admin to let me edit/create objects without specifying that foreign key?
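    For what it's worth, a hedged sketch of the usual explanation (not taken from the post): in Django, null=True only controls the database column, while the admin's form validation is driven by blank, so the field typically also needs blank=True before the admin will accept an empty selection.

        class ImportRule(models.Model):
            feed = models.ForeignKey(Feed)
            name = models.CharField(max_length=255)
            # null=True affects only the schema; blank=True is what lets the
            # admin form submit with no category selected.
            feed_provider_category = models.ForeignKey(FeedProviderCategory,
                                                       null=True, blank=True)
            target_subcategories = models.ManyToManyField(Subcategory)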

    Read the article

  • SQLAlchemy custom sorting algorithms when using SQL indexes

    - by David M
    Is it possible to write custom collation functions, with indexes, in SQLAlchemy? SQLite, for example, allows specifying the sorting function at the C level via sqlite3_create_collation(). An implementation of part of the Unicode Collation Algorithm has been provided by James Tauber here, which for example sorts all the "a"s close together whether they have accents on them or not. Other examples of why this might be useful are different alphabet orders (languages other than English) and sorting numeric values (sorting 10 after 9 rather than in codepoint order). Is this possible in SQLAlchemy? If not, is it supported by the pysqlite3 or MySQLdb modules, or by any other SQL database modules supported by Python, for that matter? Any information would be greatly appreciated.
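    On the pysqlite side this does exist: the sqlite3 module exposes Connection.create_collation(). A hedged sketch of one way to combine that with SQLAlchemy (the collation name, the comparison function, and the event-based wiring are all my own choices, and a real UCA comparison would replace the placeholder):

        from sqlalchemy import create_engine, event, text

        def compare_caseless(a, b):
            # placeholder comparison; a real Unicode Collation Algorithm
            # implementation would go here
            a, b = a.lower(), b.lower()
            return (a > b) - (a < b)

        engine = create_engine("sqlite:///:memory:")

        @event.listens_for(engine, "connect")
        def register_collation(dbapi_conn, conn_record):
            # register the collation on every raw pysqlite connection
            dbapi_conn.create_collation("caseless", compare_caseless)

        with engine.connect() as conn:
            conn.execute(text("CREATE TABLE words (w TEXT)"))
            conn.execute(text("INSERT INTO words VALUES ('Zebra'), ('apple'), ('Apricot')"))
            rows = conn.execute(text("SELECT w FROM words ORDER BY w COLLATE caseless"))
            print([r[0] for r in rows])   # ['apple', 'Apricot', 'Zebra']

    Whether an index can then use the custom ordering is a separate question; SQLite itself accepts COLLATE clauses in CREATE INDEX, but the collation has to be registered on every connection that touches that index.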

    Read the article

  • Linq to NHibernate using Queryable.Where predicate

    - by Groo
    I am querying an SQLite database using LINQ to NHibernate. Person is an entity containing an Id and a Name:

        public class Person
        {
            public Guid Id { get; private set; }
            public string Name { get; private set; }
        }

    Let's say my db table contains a single person whose name is "John". This test works as expected:

        var query = from item in session.Linq<Person>()
                    where (item.Name == "Mike")
                    select item;

        // no such entity should exist
        Assert.IsFalse(query.Any());

    but this one fails:

        var query = from item in session.Linq<Person>()
                    select item;

        query.Where(item => item.Name == "Mike");

        // following line actually returns the "John" entry
        Assert.IsFalse(query.Any());

    What am I missing?

    Read the article

  • Are there Python ORMs out there that support multiple independent databases concurrently in use?

    - by sdt
    I'm writing an application in Python where I wish to use sqlite as the backing store for documents edited by the app, with documents generally living in memory, but being saved to disk-based databases when the application saves. Ideally I'd like to use something like an ORM to make access to the data from my Python application code simple. Unfortunately it seems like the majority of Python ORMs, including SQLAlchemy, SQLObject, Django, and Storm, associate the database connection (or engine or whatever) with the classes representing table data, rather than instances of those classes. This restricts these ORMs to working with a single database connection across all instances. Since I'd like to support having multiple documents open simultaneously, this isn't going to work for me. Are there any ORMs out there that support this usage model in Python? Bazaar seems to support this, but it's quite out of date, and at first glance appears to have some other shortcomings. Thanks for any suggestions!
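    As a data point, plain SQLAlchemy (without bound metadata) can already keep several independent SQLite files open at once, because it is the Session, not the mapped class, that carries the engine. A hedged sketch, with invented file names and a toy model:

        from sqlalchemy import Column, Integer, String, create_engine
        from sqlalchemy.orm import declarative_base, sessionmaker

        Base = declarative_base()

        class Note(Base):
            __tablename__ = "notes"
            id = Column(Integer, primary_key=True)
            text = Column(String)

        def open_document(path):
            # one engine + one session per open document
            engine = create_engine(f"sqlite:///{path}")
            Base.metadata.create_all(engine)
            return sessionmaker(bind=engine)()

        doc_a = open_document("a.sqlite")
        doc_b = open_document("b.sqlite")
        doc_a.add(Note(text="only in document A"))
        doc_a.commit()
        print(doc_b.query(Note).count())   # 0 -- the two stores stay independent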

    Read the article
