Search Results

Search found 2952 results on 119 pages for 'dependencies'.

Page 85/119

  • I'm getting the error in iTunes Connect: The binary you uploaded was invalid. The signature was invalid

    - by Joshua
    I went through the dev portal provisioning process twice now trying to get it to work, but to no avail. I don't think it's the second half (signature is invalid); I think it actually may have to do with my binary. I have a warning in Xcode that isn't helping me because I don't know what to do about it, and honestly I don't know how relevant this information even is. But it says: "Check Dependencies: Warning: The copy bundle resources build phase contains the target's Info.plist". The app runs perfectly in the simulator, and I haven't made any changes to the Info.plist since I submitted the app to Apple last week. (This is an update.) Any suggestions?

    Read the article

  • Can EPD Python and MacPorts Python coexist on OS X (matplotlib)?

    - by bjoern
    I've been using MacPorts Python 2.6 on OS X 10.6. I am considering also installing the Enthought Python Distribution (EPD) on the same machine because it comes preconfigured with matplotlib and other nice data analysis and visualization packages. Can the two Python distributions coexist peacefully on the same machine? What potential problems will I have to look out for (e.g., environment variables)? I know that building matplotlib through MacPorts is an option, but the process is lengthy (on the order of a full day) and there are open questions about compiling some dependencies on 64-bit Intel. I would like to know about the tradeoffs before committing to one of the two approaches.

    Read the article

  • <jaxrs:client> not getting autowired

    - by himangshu
    I am trying to build a RESTful client using jaxrs:client as defined in http://svn.apache.org/repos/asf/cxf/trunk/systests/jaxrs/src/test/resources/jaxrs_soap_rest/WEB-INF/beans.xml. In my test class I am getting:

        org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'com.abc.service.ExportServiceTest': Injection of autowired dependencies failed; nested exception is org.springframework.beans.factory.BeanCreationException: Could not autowire field: private com.bankbazaar.service.ExportService com.abc.service.ExportServiceTest.exportClient; nested exception is org.springframework.beans.factory.NoSuchBeanDefinitionException: No matching bean of type [com.abc.service.ExportService] found for dependency: expected at least 1 bean which qualifies as autowire candidate for this dependency. Dependency annotations: {@org.springframework.beans.factory.annotation.Autowired(required=true), @org.springframework.beans.factory.annotation.Qualifier(value=exportClient)}

    This is my Spring config. However, exportClient = (ExportService) applicationContext.getBean("exportClient"); does work. Thanks, Himangshu

    Read the article

  • Makefile option/rule to handle missing/removed source files

    - by b3nj1
    http://stackoverflow.com/questions/239004/need-a-makefile-dependency-rule-that-can-handle-missing-files gives some pointers on how to handle removed source files when generating .o files. I'm using gcc/g++, so adding the -MP option when generating dependencies works great for me, until I get to the link stage with my .a file. What about updating archives/libraries when input sources go away? The following works OK for me, but is there a cleaner way (i.e., something as straightforward as the g++ -MP option)?

        # BUILD_DIR is my target directory (includes Debug/Release and target arch)
        # SRC_OUTS are my .o files
        LIBATLS_HAS = $(shell nm ${BUILD_DIR}/libatls.a | grep ${BUILD_DIR} | sed -e 's/.*(//' -e 's/).*://')
        LIBATLS_REMOVE = $(filter-out $(notdir ${SRC_OUTS}), ${LIBATLS_HAS})

        ${BUILD_DIR}/libatls.a: ${BUILD_DIR}/libatls.a(${SRC_OUTS})
        ifneq ($(strip ${LIBATLS_REMOVE}),)
                $(AR) -d $@ ${LIBATLS_REMOVE}
        endif

    Read the article

  • How do I temporarily change the require path in Ruby ($:)?

    - by John Feminella
    I'm doing some trickery with a bunch of Rake tasks for a complex project, gradually refactoring away some of the complexity in chunks at a time. This has exposed the bizarre web of dependencies left behind by the previous project maintainer. What I'd like to be able to do is add a specific path in the project to require's list of paths to be searched, a.k.a. $:. However, I only want that path to be searched in the context of one particular method. Right now I'm doing something like this:

        def foo()
          # Look up old paths, add new special path.
          paths = $:
          $: << special_path

          # Do work ...
          bar()
          baz()
          quux()

          # Reset.
          $:.clear
          $: << paths
        end

        def bar()
          require '...' # If called from within foo(), will also search special_path.
          ...
        end

    This is clearly a monstrous hack. Is there a better way?

    Read the article

  • How to install a program that depends on the libstdc++ library

    - by Alex Farber
    My program is written in C++, using GCC on Ubuntu 9.10 64-bit. It depends on /usr/lib64/libstdc++.so.6, which actually points to /usr/lib64/libstdc++.so.6.0.13. Now I copy this program to a fresh Ubuntu 7.04 system and try to run it. It doesn't run, as expected. Then I add the following files to the program directory:

        libstdc++.so.6.0.13
        libstdc++.so.6 (links to libstdc++.so.6.0.13)

    and execute the command:

        LD_LIBRARY_PATH=. ./myprogram

    Now everything is OK. The question: how can I write an installation script for such a program? The myprogram file itself should be placed in /usr/local/bin. What can I do about the dependencies? For example, on the destination computer, the /usr/lib64/libstdc++.so.6 link points to /usr/lib64/libstdc++.so.6.0.8. What can I do about this? Note: the program is closed-source; I cannot provide source code and a makefile.

    Read the article

  • Tips on how to deploy C++ code to work everywhere

    - by User1
    I'm not talking about making portable code. This is more a question of distribution. I have a medium-sized project. It has several dependencies on common libraries (e.g. openssl, zlib, etc.). It compiles fine on my machine and now it's time to give it to the world. Essentially, this is build engineering at its finest. I want to make installers for Windows, Linux, Mac OS X, etc. I want to make a downloadable tar ball that will make the code work with a ./configure and a make (probably via autoconf). It would be icing on the cake to have a make option that would build the installers, maybe even cross-compile so a Windows installer could be built on Linux. What is the best strategy? Where can I expect to spend the most time? Should the prime focus be autoconf, or are there other tools that can help?

    Read the article

  • Quick way to do data lookup in PHP

    - by Ghostrider
    I have a data table with 600,000 records that is around 25 megabytes in size. It is indexed by a 4-byte key. Is there a way to find a row in such a dataset quickly with PHP, without resorting to MySQL? The website in question is mostly static, with minor PHP code and no database dependencies, and is therefore fast. I would like to add this data without having to use MySQL if possible. In C++ I would memory-map the file and do a binary search in it. Is there a way to do something similar in PHP?
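
    For reference, a minimal C++ sketch of the memory-map-and-binary-search approach mentioned above; it assumes the records are sorted by key, and the record layout, payload size, and file name are purely illustrative:

        // Binary search over fixed-size records in a memory-mapped file (POSIX).
        // The Record layout and "data.bin" are hypothetical.
        #include <cstddef>
        #include <cstdint>
        #include <fcntl.h>
        #include <sys/mman.h>
        #include <sys/stat.h>
        #include <unistd.h>

        struct Record {
            uint32_t key;        // the 4-byte key the table is indexed by
            char     value[40];  // assumed payload size (~25 MB / 600,000 rows)
        };

        const Record* lookup(const Record* recs, size_t count, uint32_t key) {
            size_t lo = 0, hi = count;
            while (lo < hi) {                       // classic binary search
                size_t mid = lo + (hi - lo) / 2;
                if      (recs[mid].key < key) lo = mid + 1;
                else if (recs[mid].key > key) hi = mid;
                else                          return &recs[mid];
            }
            return nullptr;                         // key not present
        }

        int main() {
            int fd = open("data.bin", O_RDONLY);
            if (fd < 0) return 1;
            struct stat st;
            fstat(fd, &st);
            void* base = mmap(nullptr, st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);
            const Record* recs = static_cast<const Record*>(base);
            const Record* hit = lookup(recs, st.st_size / sizeof(Record), 12345u);
            munmap(base, st.st_size);
            close(fd);
            return hit ? 0 : 2;
        }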

    Read the article

  • Making OR/M loosely coupled and abstracted away from other layers

    - by Genuine
    Hi all. In an n-tier architecture, the best place to put object-relational mapping (OR/M) code is in the data access layer. For example, database queries and updates can be delegated to a tool like NHibernate. Yet I'd like to keep all references to NHibernate within the data access layer and abstract the dependencies away from the layers below or above it. That way, I can swap in or plug in another OR/M tool (e.g. Entity Framework) or another approach (e.g. plain vanilla stored procedure calls, mock objects) without causing compile-time errors or a major overhaul of the entire application. Testability is an added bonus. Could someone please suggest a wrapper (i.e. an interface or base class) or approach that would keep the OR/M loosely coupled and contained in one layer? Or point me to resources that would help? Thanks.

    Read the article

  • Simple 2 way encryption for C#

    - by Matt Dawdy
    I'm looking for very simple encrypt and decrypt functionality for some data. It's not mission critical. I need something to keep honest people honest, but something a little stronger than ROT13 or Base64. I'd prefer something that is already included in the .NET Framework 2.0 so I don't have to worry about any external dependencies. Pre-emptive edit: I really don't want to have to mess around with public/private keys, etc. I don't know much about encryption, but I know enough to realize that anything I wrote would be less than worthless... in fact, I'd probably screw up the math and make it trivial to crack.

    Read the article

  • Python unittest with expensive setup

    - by Staale
    My test file is basically:

        class Test(unittest.TestCase):
            def testOk(self):
                pass

        if __name__ == "__main__":
            expensiveSetup()
            try:
                unittest.main()
            finally:
                cleanUp()

    However, I wish to run my tests through the NetBeans testing tools, and to do that I need unit tests that don't rely on an environment setup done in main. Looking at http://stackoverflow.com/questions/402483/caching-result-of-setup-using-python-unittest - it recommends using Nose. However, I don't think NetBeans supports this; I didn't find any information indicating that it does. Additionally, I am the only one here actually writing tests, so I don't want to introduce additional dependencies for the other 2 developers unless they are needed. How can I do the setup and cleanup once for all the tests in my TestSuite? The expensive setup here is creating some files with dummy data, as well as setting up and tearing down a simple XML-RPC server. I also have 2 test classes, one testing locally and one testing all methods over XML-RPC.

    Read the article

  • How can unit testing make parameter validation redundant?

    - by Johann Gerell
    We have a convention to validate all parameters of constructors and public functions/methods. For mandatory parameters of reference type, we mainly check for non-null, and that's the chief validation in constructors, where we set up the mandatory dependencies of the type. The number one reason why we do this is to catch the error early and not get a null reference exception a few hours down the line, without knowing where or when the faulty parameter was introduced. As we start transitioning to more and more TDD, some team members feel the validation is redundant. Uncle Bob, who is a vocal advocate of TDD, strongly advises against doing parameter validation. His main argument seems to be "I have a suite of unit tests that makes sure everything works". But I cannot for the life of me see how unit tests can prevent our developers from calling these methods with bad parameters in production code. Please, unit testers out there, if you could explain this to me in a rational way with concrete examples, I'd be more than happy to cease this parameter validation!

    Read the article

  • ASP.NET MVC web app not running

    - by Aidan Host
    Hi, I developed an ASP.NET MVC v1 web application and it ran fine on our server. The client wanted to move to another server, and the site does not run on the new host's server. The new server specs:

        Windows 2008
        ASP.NET Framework v4
        ASP.NET MVC 2 (AFAIK it's included with .NET v4)
        IIS 7.5 (AFAIK)

    Error message:

        Could not load file or assembly 'System.Web.Mvc, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' or one of its dependencies. The system cannot find the file specified.

    My understanding is that it should be backwards compatible. Is the app really trying to run as ASP.NET MVC v1 when v2 is available? I have already tried deploying the MVC .dlls to the Bin folder, but it did not work. I also tried changing all the System.Web.Mvc version values (in the web.config for the web app) from 1.0.0.0 to 2.0.0.0, which also did not work. Any assistance will be greatly appreciated.

    Read the article

  • Compiled ASP.NET application can be viewed locally only

    - by cfdev9
    I have a development website running on my local machine. I can access it locally by typing the address http://mycomputer.mynetwork.local/myapp/default.aspx; however, when anybody else tries to browse to it they get an error:

        Server Error in '/' Application.
        The resource cannot be found.
        Description: HTTP 404. The resource you are looking for (or one of its dependencies) could have been removed, had its name changed, or is temporarily unavailable. Please review the following URL and make sure that it is spelled correctly.

    I'm using IIS7, ASP.NET 3.5, and the application is pre-compiled. Any hints?

    Read the article

  • Always require a plugin to be loaded

    - by axon
    I am writing an application with RequireJS and Knockout JS. The application includes an extension to Knockout that adds ko.protectedObservable to the main Knockout object. I would like this to always be available on the require'd Knockout object, and not only when I specify it in the dependencies. I could concat the files together, but that seems unnecessary. Additionally, I can put knockout-protectedObservable as a dependency for knockout in the RequireJS shim configuration, but that just leads to a circular dependency and it all fails to load. Edit: I've solved my issue, but it seems hacky; is there a better way?

        -- Main.html
        <script type="text/javascript" src="require.js"></script>
        <script type="text/javascript">
            require(['knockout'], function(ko) {
                require(['knockout-protectedObservable']);
            });
        </script>

        -- knockout-protectedObservable.js
        define(['knockout'], function(ko) {
            ko.protectedObservable = { ... };
        });

    Read the article

  • Compile C++ from VS08/10 without Run Time Library / MFC

    - by Lienau
    Are there settings I can adjust in Visual Studio so that it does not compile with any runtime library or MFC? I started learning C++ to get away from C#'s .NET, and this is just as bad. When I execute the program in a Windows XP virtual machine I get an error. I can compile without the dependencies in Code::Blocks, but I'm more familiar with VS and prefer many of its features over those of Code::Blocks. If you know how to get past this, it would be greatly appreciated. Thanks.

    Read the article

  • Using Maven to build JAR from source in Subversion trunk

    - by Anonymouse
    There's a Java library that I would like to use in my project. My project uses Maven to pull in dependencies and it works great for everything except this one library. The problem is, this library never has releases. The author maintains the source in a Subversion repository and only makes changes in trunk. Is there a way I can tell Maven to:

        1. Update (or check out) the library's source tree from Subversion
        2. Build it according to its POM
        3. Use the resulting jar as a dependency for this project
        4. Do this regularly (possibly at each build)

    For bonus points, mark which Subversion revision of that library I want to use. Thanks!

    Read the article

  • Are runtime-linked library globals shared among plugins loaded with dlopen?

    - by conejoroy
    I have a C++ program that links at runtime with, let's say, mylib.so. Then the same program uses dlopen()/dlsym() to load a function from myplugin.so, a dynamic library that in turn has dependencies on mylib.so. My question is: will the program and the function in the plugin access the same globals defined in mylib.so, in the same memory area reserved for the program, or will each be assigned a different, unrelated copy in its own memory space? If the latter is the default behaviour, is it possible to change that? Thanks in advance =)!
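
    For concreteness, a minimal C++ sketch of the loading pattern described above; the plugin path, the exported symbol name, and its signature are hypothetical:

        // Load a plugin at runtime and call a function it exports (POSIX dlopen API).
        // myplugin.so itself lists mylib.so as a dependency, as in the question.
        #include <dlfcn.h>
        #include <cstdio>

        int main() {
            void* handle = dlopen("./myplugin.so", RTLD_NOW);
            if (!handle) {
                std::fprintf(stderr, "dlopen failed: %s\n", dlerror());
                return 1;
            }

            // Look up a function exported by the plugin (hypothetical name).
            using plugin_fn = void (*)();
            plugin_fn fn = reinterpret_cast<plugin_fn>(dlsym(handle, "plugin_entry"));
            if (!fn) {
                std::fprintf(stderr, "dlsym failed: %s\n", dlerror());
                dlclose(handle);
                return 1;
            }

            fn();   // plugin code runs here and may touch globals from mylib.so
            dlclose(handle);
            return 0;
        }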

    Read the article

  • How to add a .NET 3.5 DLL to a .NET 2.0 project?

    - by macias
    I have a DLL which is based on .NET 3.5 -- internally it uses, for example, LINQ -- but the exposed API is straightforward, no fancy stuff. Since C# generics are resolved at compile time, I assume that for the calling party all that counts is the API (all the public parts). However, when I try to use this DLL from a .NET 2.0 project, I am told that the DLL cannot be referenced because the DLL or one of its dependencies requires a later version of the .NET Framework. I can install any .NET version I want on the target computer (when the entire app is installed), but I cannot change the .NET version for the project itself. So: how do I solve this? When adding a C DLL to this project I had no such problems, so are C# DLLs self-contained or not?

    Read the article

  • Ruby-Graphviz does not render PNG

    - by auralbee
    I just tried the ruby-graphviz gem (http://github.com/glejeune/Ruby-Graphviz). I followed the instructions (installed Graphviz, the gem, and its dependencies) and tried the example from the GitHub page. Unfortunately I am not able to render any output image (png, dot).

        # Create a new graph
        g = GraphViz.new( :G, :type => :digraph )

        # Create two nodes
        hello = g.add_node( "Hello" )
        world = g.add_node( "World" )

        # Create an edge between the two nodes
        g.add_edge( hello, world )

        # Generate output image
        g.output( :png => "hello_world.png" )

    When I run the script from the console I get no error message, but also no output as expected. What could be the problem? Folders have read/write access for everybody. Thanks in advance. By the way, I'm working on a Mac (Leopard 10.6).

    Read the article

  • Ant support for dynamic targets

    - by Li He
    I previously saw some similar questions on Stack Overflow but didn't see any solution. I guess the answer could be "impossible" and I am trying to see who can provide me with that confirmation. AFAIK, an Ant project contains several targets and each target may have several tasks. There is a task, macrodef, that defines a sequential of 'things' (tasks, I suppose?). I tried to put a target inside this block, but Ant complains that the name of the target is missing (I am using the attribute of the macrodef to generate the name of the target). So it could be a dead end. Then I found that by using a 'script' task we have access to the Project and can even call addTarget/addOrReplaceTarget from there. But it seems that the targets I create there have no effect on the running targets. Does that mean Ant doesn't support manipulating dependencies at target runtime? Is there any way to generate these targets before Ant starts building the dependency graph?

    Read the article

  • Archiving Database Tables using Java

    - by HonorGod
    My application demands archiving database tables between Sybase and DB2 and vice versa, and also within the same platform (DB2 to DB2 and Sybase to Sybase), using Java. I am trying to understand the best strategies in terms of performance, implementation, ease of use, and scalability. Here is my current process:

        - Source and destination tables, with the acceptable parameters (from Java), are defined in XML.
        - The application reads the source and destination configurations and executes them sequentially.
        - The destination is sometimes optional, when the source is just deleting data from a specific table or when the source is just calling a stored procedure.
        - The dataset between source and destination is extremely large (in the millions).

    Off the top of my head, it looks like I can define dependencies between multiple source and destination combinations and have them execute in parallel in multiple threads. But will this improve performance at all (I hope it will)? Are there any open-source frameworks for data archiving using Java? Any other thoughts on the implementation side would be really helpful. Thanks.

    Read the article

  • Designing a multi-module J2EE application

    - by user728947
    My question might be abstract or out of context, but I am asking here since I have little idea how this happens. I am wondering how big applications/platforms break their application down into multiple modules and how they are able to manage the dependencies between modules. For example, some e-commerce applications tend to break down into various modules like pricing, promotions, shipping, import/export, and many more. When we develop those applications we hardly think about the underlying modules and how they have been designed to provide their functionality. Most of those modules are not web applications but standalone modules, not deployed in the web app as JAR files. Can anyone help me understand how they break things up, or is there a standard way to do this? Any help/resources to gain insight would be really helpful.

    Read the article

  • Modifiers in Makefile rule's dependency list

    - by gnu_maker
    The problem is fairly simple. I am trying to write a rule that, given the name of the required file, will be able to tailor its dependencies. Let's say I have two programs, calc_foo and calc_bar, and each generates a file whose output depends on a parameter. My targets would have names of the form 'target_*_*'; for example, 'target_foo_1' would be generated by running './calc_foo 1'. The question is: how do I write a makefile that would generate outputs of the two programs for a range of parameters?

    Read the article

  • How can I make a workspace-folder level build script visible in the Eclipse Project Explorer?

    - by Chris
    I have a number of interdependent projects in an Eclipse workspace. Eclipse manages the dependencies between them within the IDE, but I'm starting work on a master build script that will sit in the folder above all the projects (the workspace folder). I haven't decided yet whether I will use Maven, Gradle or Ant/Ivy, but my question is: is there a way to make a build script in the workspace folder visible in the Project/Package Explorer? Currently it only shows me projects, but assuming I decide on an Ant build, I want to be able to see the main build.xml file in this window. I've played around with settings to no avail. Is it possible? If not, should I just set up an external run configuration instead?

    Read the article
