Search Results

Search found 13675 results on 547 pages for 'online repository'.


  • Create a Remote Git Repository from an Existing XCode Repository

    - by codeWithoutFear
    Introduction

    Distributed version control systems (VCSs), like Git, provide a rich set of features for managing source code. Many development tools, including XCode, provide built-in support for various VCSs. These tools offer simple configuration with limited customization to get you up and running quickly, while still providing the safety net of basic version control.

    I hate losing (and re-doing) work. I have OCD when it comes to saving and versioning source code. Save early, save often, and commit to the VCS often. I also hate merging code. Smaller and more frequent commits enable me to minimize merge time and effort as well. The workflow I prefer, even for personal exploratory projects, is:

    1. Make small local changes to the codebase to create an incrementally improved (and working) system.
    2. Commit these changes to the local repository. Local repositories are quick to access, function even while offline, and provide the confidence to continue making bold changes to the system. After all, I can easily recover to a recent working state.
    3. Repeat 1 & 2 until the codebase contains "significant" functionality and I have connectivity to the remote repository.
    4. Push the accumulated changes to the remote repository. The smaller the change set, the less likely extensive merging will be required. Smaller is better, IMHO.

    The remote repository typically has a greater degree of fault tolerance and active management dedicated to it. This can be as simple as a network share that is backed up nightly, or as complex as dedicated hardware with specialized server-side processing and significant administrative monitoring.

    XCode's out-of-the-box Git integration enables steps 1 and 2 above. Time Machine backups of the local repository add an additional degree of fault tolerance, but do not support collaboration or take advantage of managed infrastructure such as on-premises or cloud-based storage.

    Creating a Remote Repository

    These are the steps I use to enable the full workflow identified above. For simplicity, the "remote" repository is created on the local file system. This location could easily be on a mounted network volume.

    Create a Test Project

    My project is called HelloGit and is located at /Users/Don/Dev/HelloGit. Be sure to commit all outstanding changes. XCode always leaves a single changed file for me after the project is created and the initial commit is submitted.

    Clone the Local Repository

    We want to clone the XCode-created Git repository to the location where the remote repository will reside. In this case it will be /Users/Don/Dev/RemoteHelloGit.

    1. Open the Terminal application.
    2. Clone the local repository to the remote repository location: git clone /Users/Don/Dev/HelloGit /Users/Don/Dev/RemoteHelloGit

    Convert the Remote Repository to a Bare Repository

    The remote repository only needs to contain the Git database. It does not need a checked-out branch or local files.

    1. Go to the remote repository folder: cd /Users/Don/Dev/RemoteHelloGit
    2. Indicate the repository is "bare": git config --bool core.bare true
    3. Remove the files, leaving the .git folder: rm -R *
    4. Remove the "origin" remote: git remote rm origin

    Configure the Local Repository

    The local repository should reference the remote repository. The remote name "origin" is used by convention to indicate the originating repository. This is set automatically when a repository is cloned. We will use the "origin" name here to reflect that relationship.
    1. Go to the local repository folder: cd /Users/Don/Dev/HelloGit
    2. Add the remote: git remote add origin /Users/Don/Dev/RemoteHelloGit

    Test Connectivity

    Any changes made to the local Git repository can be pushed to the remote repository, subject to the merging rules Git enforces.

    1. Create a new local file: date > date.txt
    2. Add the new file to the local index: git add date.txt
    3. Commit the change to the local repository: git commit -m "New file: date.txt"
    4. Push the change to the remote repository: git push origin master

    Now you can save, commit, and push/pull to your OCD heart's content! Code without fear! --Don

    Read the article

  • apt-get install and update fail

    - by sepehr
    I've got a problem with the apt-get update and apt-get install ... commands. Every time, updating or installing fails with these errors:

    Get:1 http://dl.google.com stable Release.gpg [198B]
    Ign http://dl.google.com/linux/chrome/deb/ stable/main Translation-en_US
    Get:2 http://dl.google.com stable Release [1,347B]
    Get:3 http://dl.google.com stable/main Packages [1,227B]
    Err http://32.repository.backtrack-linux.org revolution Release.gpg
      Could not connect to 32.repository.backtrack-linux.org:80 (37.221.173.214). - connect (110: Connection timed out)
    Err http://32.repository.backtrack-linux.org/ revolution/main Translation-en_US
      Unable to connect to 32.repository.backtrack-linux.org:http:
    Err http://32.repository.backtrack-linux.org/ revolution/microverse Translation-en_US
      Unable to connect to 32.repository.backtrack-linux.org:http:
    Err http://32.repository.backtrack-linux.org/ revolution/non-free Translation-en_US
      Unable to connect to 32.repository.backtrack-linux.org:http:
    Err http://32.repository.backtrack-linux.org/ revolution/testing Translation-en_US
      Unable to connect to 32.repository.backtrack-linux.org:http:
    Err http://all.repository.backtrack-linux.org revolution Release.gpg
      Could not connect to all.repository.backtrack-linux.org:80 (37.221.173.214). - connect (110: Connection timed out)
    Err http://all.repository.backtrack-linux.org/ revolution/main Translation-en_US
      Unable to connect to all.repository.backtrack-linux.org:http:
    Err http://all.repository.backtrack-linux.org/ revolution/microverse Translation-en_US
      Unable to connect to all.repository.backtrack-linux.org:http:
    Err http://all.repository.backtrack-linux.org/ revolution/non-free Translation-en_US
      Unable to connect to all.repository.backtrack-linux.org:http:
    Err http://all.repository.backtrack-linux.org/ revolution/testing Translation-en_US
      Unable to connect to all.repository.backtrack-linux.org:http:
    Ign http://32.repository.backtrack-linux.org revolution Release
    Ign http://all.repository.backtrack-linux.org revolution Release
    Ign http://32.repository.backtrack-linux.org revolution/main Packages
    Ign http://all.repository.backtrack-linux.org revolution/main Packages
    Ign http://32.repository.backtrack-linux.org revolution/microverse Packages
    Ign http://32.repository.backtrack-linux.org revolution/non-free Packages
    Ign http://32.repository.backtrack-linux.org revolution/testing Packages
    Ign http://all.repository.backtrack-linux.org revolution/microverse Packages
    Ign http://all.repository.backtrack-linux.org revolution/non-free Packages
    Ign http://all.repository.backtrack-linux.org revolution/testing Packages
    Ign http://32.repository.backtrack-linux.org revolution/main Packages
    Ign http://32.repository.backtrack-linux.org revolution/microverse Packages
    Ign http://32.repository.backtrack-linux.org revolution/non-free Packages
    Ign http://all.repository.backtrack-linux.org revolution/main Packages
    Ign http://all.repository.backtrack-linux.org revolution/microverse Packages
    Ign http://all.repository.backtrack-linux.org revolution/non-free Packages
    Ign http://all.repository.backtrack-linux.org revolution/testing Packages
    Err http://all.repository.backtrack-linux.org revolution/main Packages
      Unable to connect to all.repository.backtrack-linux.org:http:
    Err http://all.repository.backtrack-linux.org revolution/microverse Packages
      Unable to connect to all.repository.backtrack-linux.org:http:
    Ign http://32.repository.backtrack-linux.org revolution/testing Packages
    Err http://32.repository.backtrack-linux.org revolution/main Packages
      Unable to connect to 32.repository.backtrack-linux.org:http:
    Err http://32.repository.backtrack-linux.org revolution/microverse Packages
      Unable to connect to 32.repository.backtrack-linux.org:http:
    Err http://all.repository.backtrack-linux.org revolution/non-free Packages
      Unable to connect to all.repository.backtrack-linux.org:http:
    Err http://all.repository.backtrack-linux.org revolution/testing Packages
      Unable to connect to all.repository.backtrack-linux.org:http:
    Err http://32.repository.backtrack-linux.org revolution/non-free Packages
      Unable to connect to 32.repository.backtrack-linux.org:http:
    Err http://32.repository.backtrack-linux.org revolution/testing Packages
      Unable to connect to 32.repository.backtrack-linux.org:http:
    Err http://source.repository.backtrack-linux.org revolution Release.gpg
      Could not connect to source.repository.backtrack-linux.org:80 (37.221.173.214). - connect (110: Connection timed out)
    Err http://source.repository.backtrack-linux.org/ revolution/main Translation-en_US
      Unable to connect to source.repository.backtrack-linux.org:http:
    Err http://source.repository.backtrack-linux.org/ revolution/microverse Translation-en_US
      Unable to connect to source.repository.backtrack-linux.org:http:
    Err http://source.repository.backtrack-linux.org/ revolution/non-free Translation-en_US
      Unable to connect to source.repository.backtrack-linux.org:http:
    Err http://source.repository.backtrack-linux.org/ revolution/testing Translation-en_US
      Unable to connect to source.repository.backtrack-linux.org:http:
    Ign http://source.repository.backtrack-linux.org revolution Release
    Ign http://source.repository.backtrack-linux.org revolution/main Packages
    Ign http://source.repository.backtrack-linux.org revolution/microverse Packages
    Ign http://source.repository.backtrack-linux.org revolution/non-free Packages
    Ign http://source.repository.backtrack-linux.org revolution/testing Packages
    Ign http://source.repository.backtrack-linux.org revolution/main Packages
    Ign http://source.repository.backtrack-linux.org revolution/microverse Packages
    Ign http://source.repository.backtrack-linux.org revolution/non-free Packages
    Ign http://source.repository.backtrack-linux.org revolution/testing Packages
    Err http://source.repository.backtrack-linux.org revolution/main Packages
      Unable to connect to source.repository.backtrack-linux.org:http:
    Err http://source.repository.backtrack-linux.org revolution/microverse Packages
      Unable to connect to source.repository.backtrack-linux.org:http:
    Err http://source.repository.backtrack-linux.org revolution/non-free Packages
      Unable to connect to source.repository.backtrack-linux.org:http:
    Err http://source.repository.backtrack-linux.org revolution/testing Packages
      Unable to connect to source.repository.backtrack-linux.org:http:
    Fetched 2,772B in 1min 3s (44B/s)
    W: Failed to fetch http://all.repository.backtrack-linux.org/dists/revolution/Release.gpg  Could not connect to all.repository.backtrack-linux.org:80 (37.221.173.214). - connect (110: Connection timed out)
    W: Failed to fetch http://all.repository.backtrack-linux.org/dists/revolution/main/i18n/Translation-en_US.bz2  Unable to connect to all.repository.backtrack-linux.org:http:
    W: Failed to fetch http://all.repository.backtrack-linux.org/dists/revolution/microverse/i18n/Translation-en_US.bz2  Unable to connect to all.repository.backtrack-linux.org:http:
    W: Failed to fetch http://all.repository.backtrack-linux.org/dists/revolution/non-free/i18n/Translation-en_US.bz2  Unable to connect to all.repository.backtrack-linux.org:http:
    W: Failed to fetch http://all.repository.backtrack-linux.org/dists/revolution/testing/i18n/Translation-en_US.bz2  Unable to connect to all.repository.backtrack-linux.org:http:
    W: Failed to fetch http://32.repository.backtrack-linux.org/dists/revolution/Release.gpg  Could not connect to 32.repository.backtrack-linux.org:80 (37.221.173.214). - connect (110: Connection timed out)
    W: Failed to fetch http://32.repository.backtrack-linux.org/dists/revolution/main/i18n/Translation-en_US.bz2  Unable to connect to 32.repository.backtrack-linux.org:http:
    W: Failed to fetch http://32.repository.backtrack-linux.org/dists/revolution/microverse/i18n/Translation-en_US.bz2  Unable to connect to 32.repository.backtrack-linux.org:http:
    W: Failed to fetch http://32.repository.backtrack-linux.org/dists/revolution/non-free/i18n/Translation-en_US.bz2  Unable to connect to 32.repository.backtrack-linux.org:http:
    W: Failed to fetch http://32.repository.backtrack-linux.org/dists/revolution/testing/i18n/Translation-en_US.bz2  Unable to connect to 32.repository.backtrack-linux.org:http:
    W: Failed to fetch http://source.repository.backtrack-linux.org/dists/revolution/Release.gpg  Could not connect to source.repository.backtrack-linux.org:80 (37.221.173.214). - connect (110: Connection timed out)
    W: Failed to fetch http://source.repository.backtrack-linux.org/dists/revolution/main/i18n/Translation-en_US.bz2  Unable to connect to source.repository.backtrack-linux.org:http:
    W: Failed to fetch http://source.repository.backtrack-linux.org/dists/revolution/microverse/i18n/Translation-en_US.bz2  Unable to connect to source.repository.backtrack-linux.org:http:
    W: Failed to fetch http://source.repository.backtrack-linux.org/dists/revolution/non-free/i18n/Translation-en_US.bz2  Unable to connect to source.repository.backtrack-linux.org:http:
    W: Failed to fetch http://source.repository.backtrack-linux.org/dists/revolution/testing/i18n/Translation-en_US.bz2  Unable to connect to source.repository.backtrack-linux.org:http:
    W: Failed to fetch http://32.repository.backtrack-linux.org/dists/revolution/main/binary-i386/Packages.gz  Unable to connect to 32.repository.backtrack-linux.org:http:
    W: Failed to fetch http://32.repository.backtrack-linux.org/dists/revolution/microverse/binary-i386/Packages.gz  Unable to connect to 32.repository.backtrack-linux.org:http:
    W: Failed to fetch http://32.repository.backtrack-linux.org/dists/revolution/non-free/binary-i386/Packages.gz  Unable to connect to 32.repository.backtrack-linux.org:http:
    W: Failed to fetch http://all.repository.backtrack-linux.org/dists/revolution/main/binary-i386/Packages.gz  Unable to connect to all.repository.backtrack-linux.org:http:
    W: Failed to fetch http://all.repository.backtrack-linux.org/dists/revolution/microverse/binary-i386/Packages.gz  Unable to connect to all.repository.backtrack-linux.org:http:
    W: Failed to fetch http://all.repository.backtrack-linux.org/dists/revolution/non-free/binary-i386/Packages.gz  Unable to connect to all.repository.backtrack-linux.org:http:
    W: Failed to fetch http://all.repository.backtrack-linux.org/dists/revolution/testing/binary-i386/Packages.gz  Unable to connect to all.repository.backtrack-linux.org:http:
    W: Failed to fetch http://32.repository.backtrack-linux.org/dists/revolution/testing/binary-i386/Packages.gz  Unable to connect to 32.repository.backtrack-linux.org:http:
    W: Failed to fetch http://source.repository.backtrack-linux.org/dists/revolution/main/binary-i386/Packages.gz  Unable to connect to source.repository.backtrack-linux.org:http:
    W: Failed to fetch http://source.repository.backtrack-linux.org/dists/revolution/microverse/binary-i386/Packages.gz  Unable to connect to source.repository.backtrack-linux.org:http:
    W: Failed to fetch http://source.repository.backtrack-linux.org/dists/revolution/non-free/binary-i386/Packages.gz  Unable to connect to source.repository.backtrack-linux.org:http:
    W: Failed to fetch http://source.repository.backtrack-linux.org/dists/revolution/testing/binary-i386/Packages.gz  Unable to connect to source.repository.backtrack-linux.org:http:
    E: Some index files failed to download, they have been ignored, or old ones used instead.

    I don't know how to get out of this! I want to install the RPM and YUM packages on my BackTrack system. I also searched the Internet for an answer; in the BackTrack forums and on other sites and weblogs I couldn't find a good one. Can anyone help?

    Read the article

  • Relationship between Repository and Unit of Work

    - by NullOrEmpty
    I am going to implement a repository, and I would like to use the UoW pattern, since the consumer of the repository could do several operations and I want to commit them all at once. After reading several articles about the matter, I still don't get how to relate these two elements; depending on the article, it is done one way or the other.

    Sometimes the UoW is something internal to the repository:

    public class Repository
    {
        UnitOfWork _uow;

        public Repository()
        {
            _uow = IoC.Get<UnitOfWork>();
        }

        public void Save(Entity e)
        {
            _uow.Track(e);
        }

        public void SubmitChanges()
        {
            SaveInStorage(_uow.GetChanges());
        }
    }

    And sometimes it is external:

    public class Repository
    {
        public void Save(Entity e, UnitOfWork uow)
        {
            uow.Track(e);
        }

        public void SubmitChanges(UnitOfWork uow)
        {
            SaveInStorage(uow.GetChanges());
        }
    }

    Other times, it is the UoW that references the repository:

    public class UnitOfWork
    {
        Repository _repository;

        public UnitOfWork(Repository repository)
        {
            _repository = repository;
        }

        public void Save(Entity e)
        {
            this.Track(e);
        }

        public void SubmitChanges()
        {
            _repository.Save(this.GetChanges());
        }
    }

    How are these two elements related? The UoW tracks the elements that need to be changed, and the repository contains the logic to persist those changes, but... who calls whom? Does the last one make more sense? Also, who manages the connection? If several operations have to be done in the repository, I think using the same connection, and even the same transaction, is more sound; so maybe putting the connection object inside the UoW, and that one inside the repository, makes sense as well. Cheers
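
    One common resolution to the "who calls whom" question is a fourth variant: the unit of work is the outer object, repositories are constructed against it and register changes with it, and only the unit of work commits. The sketch below is illustrative only (all type names are assumptions, not from any specific framework). Note where the connection lives: inside the unit of work, so every operation in one scope shares the same connection and transaction, as the question suggests.

    using System;
    using System.Collections.Generic;

    public class Entity
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    public interface IUnitOfWork : IDisposable
    {
        void RegisterNew(Entity e);
        void RegisterDirty(Entity e);
        void Commit();
    }

    public class UnitOfWork : IUnitOfWork
    {
        private readonly List<Entity> _added = new List<Entity>();
        private readonly List<Entity> _changed = new List<Entity>();

        public void RegisterNew(Entity e) { _added.Add(e); }
        public void RegisterDirty(Entity e) { _changed.Add(e); }

        public void Commit()
        {
            // The connection/transaction is owned here: open once, write all
            // tracked changes, then commit or roll back together.
            Console.WriteLine("Persisting {0} inserts and {1} updates in one transaction.",
                _added.Count, _changed.Count);
            _added.Clear();
            _changed.Clear();
        }

        public void Dispose() { /* release the connection here */ }
    }

    public class Repository
    {
        private readonly IUnitOfWork _uow;
        public Repository(IUnitOfWork uow) { _uow = uow; }

        // Collection-like API; actual persistence is deferred to the unit of work.
        public void Save(Entity e) { _uow.RegisterNew(e); }
        public void Update(Entity e) { _uow.RegisterDirty(e); }
    }

    public static class Program
    {
        public static void Main()
        {
            using (var uow = new UnitOfWork())
            {
                var repository = new Repository(uow);
                repository.Save(new Entity { Id = 1, Name = "first" });
                repository.Save(new Entity { Id = 2, Name = "second" });
                uow.Commit(); // several operations, committed at once
            }
        }
    }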

    Read the article

  • The Best Websites for Free Online Courses, Certificates, Degrees, and Educational Resources

    - by Lori Kaufman
    Have you thought about expanding your knowledge by taking some courses? There are several colleges and other sites that offer free online courses, certificate programs, some degree programs, and education resources for teachers and professors.

    Read the article

  • Converting mercurial repository to svn repository

    - by Jay
    I know you can convert an SVN repository to a Mercurial repository (or use Mercurial as a client to an SVN repo), but what I want is to convert a Mercurial repository to an SVN repository. We have a tool that uses SVNKit, and we'd like to continue using it, but we want to be able to work on a Mercurial repository. Hence we want to completely convert the Mercurial repo to an SVN repo. Is that something that's possible? (And how?)

    Read the article

  • How to use the unit of work and repository patterns in a service-oriented environment

    - by A. Karimi
    I've created an application framework using the unit of work and repository patterns for its data layer. Data-consumer layers such as presentation depend on the data layer design. For example, a CRUD abstract form has a dependency on a repository (IRepository). This architecture works like a charm in client/server environments (e.g., a WPF application and SQL Server). But I'm looking for a good pattern to change or reuse this architecture for a service-oriented environment. Of course I have some ideas:

    Idea 1: The "Adapter" design pattern

    Keep the current architecture and create a new unit of work and repository implementation which can work with a service instead of the ORM. Data layer consumers are loosely coupled to the data layer, so it's possible, but the problem is the unit of work; I have to create a context which tracks the objects' state on the client side and sends the changes to the server side on calling "Commit" (something that I think RIA Services has done for Silverlight). Here is the diagram:

    ----------- CLIENT ----------- | ------------------ SERVER ----------------------
    [ UI ] -> [ UoW/Repository ] ---> [ Web Services ] -> [ UoW/Repository ] -> [DB]

    Idea 2: Add another layer

    Add another layer (let's say "local services" or "data provider"), then put it between the data layer (unit of work and repository) and the data-consumer layers (like UI). Then I have to rewrite the consumer classes (CRUD and the other classes which depend on IRepository) to depend on another interface. And the diagram:

    ----------------- CLIENT ------------------ | ------------------- SERVER ---------------------
    [ UI ] -> [ Local Services/Data Provider ] ---> [ Web Services ] -> [ UoW/Repository ] -> [DB]

    Please note that I have the local services layer in the current architecture, but it doesn't expose the data layer functionality. In other words, the UI layer can communicate with both the data and local services layers, whereas the local services layer also uses the data layer.

    +------+      +----------------+      +------+
    |      | ---> | Local Services | ---> |      |
    |  UI  |      +----------------+      | Data |
    |      | ---------------------------> |      |
    +------+                              +------+
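
    For Idea 1, the client-side unit of work might look roughly like this sketch. Everything here is an assumption for illustration (IDataServiceClient and the ChangeSet shape are invented names): the point is that a context tracks object state on the client and ships the accumulated changes to the web service in a single Commit call, much like the RIA Services behavior the question mentions.

    using System;
    using System.Collections.Generic;

    public class ChangeSet
    {
        public List<object> Added { get; private set; }
        public List<object> Modified { get; private set; }
        public List<object> Deleted { get; private set; }

        public ChangeSet()
        {
            Added = new List<object>();
            Modified = new List<object>();
            Deleted = new List<object>();
        }
    }

    // Stand-in for the generated proxy of the server-side web service.
    public interface IDataServiceClient
    {
        void SubmitChanges(ChangeSet changes);
    }

    // Client-side unit of work: tracks object state locally and sends the
    // accumulated change set to the service in one call on Commit.
    public class ServiceUnitOfWork
    {
        private readonly IDataServiceClient _client;
        private readonly ChangeSet _changes = new ChangeSet();

        public ServiceUnitOfWork(IDataServiceClient client) { _client = client; }

        public void RegisterNew(object entity) { _changes.Added.Add(entity); }
        public void RegisterDirty(object entity) { _changes.Modified.Add(entity); }
        public void RegisterDeleted(object entity) { _changes.Deleted.Add(entity); }

        public void Commit()
        {
            // One round trip; the server-side unit of work applies the whole
            // batch inside its own transaction and reports success or failure.
            _client.SubmitChanges(_changes);
        }
    }

    A repository implementation built on ServiceUnitOfWork could then satisfy the existing IRepository interface, which is what keeps the UI layer unchanged.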

    Read the article

  • Setting Up Git Repository on Remote Windows Server?

    - by Goober
    I have a Windows server which I can access locally or remotely over the internet through Remote Desktop Connection, etc. I want to set up a git repository (something similar to "trunk" in Subversion) that can contain a series of repositories for multiple projects. Does anyone know how I go about doing this? I want to do it using a GUI if possible. I have followed this Git Bash tutorial, but it's very long-winded and not exactly what I'm after. I'm using a Git client called msysGit. Using this, I just want to be able to set up remote repositories and start committing source code. Any help would be greatly appreciated!

    Read the article

  • Does the "security" repository provide anything not found in the "updates" repository?

    - by netvope
    For the limited number of packages I looked at (e.g. apache), I found that the package version in the updates repository is always newer than or equal to the version available in the security repository (provided that both exist). This gives me the impression that all security patches posted to the security repository are also posted to the updates repository. If this is true, I can remove all <release_name>-security entries in my apt sources.list and the <release_name>-updates entries will still give me the security patches. This will speed up apt-get update quite a bit.

    The best documentation I have found regarding the repositories is on the community help page:

    "Important Security Updates (raring-security)": Patches for security vulnerabilities in Ubuntu packages. They are managed by the Ubuntu Security Team and are designed to change the behavior of the package as little as possible -- in fact, the minimum required to resolve the security problem. As a result, they tend to be very low-risk to apply and all users are urged to apply security updates.

    "Recommended Updates (raring-updates)": Updates for serious bugs in Ubuntu packaging that do not affect the security of the system.

    However, it does not mention whether the updates repository also includes everything in the security repository. Can anyone confirm (or refute) this?

    Read the article

  • Many small scripts, one repository or multiple?

    - by The Jug
    A co-worker and I have run into an issue that we have multiple opinions on. Currently we have a git repository that we are keeping all of our cronjobs in. There are about 20 crons, and they are not really related except for the fact that they are all small Python scripts and essential for some activity. We are using a fabric.py file to deploy and a requirements.txt file to manage the requirements for all of the scripts.

    Our issue is basically: do we keep all of these scripts in one git repository, or should we separate them out into their own repositories? By keeping them in one repository, it is easier to deploy them onto one server, and we can use just one cron file for all the scripts. However, this feels wrong, as the 20 cronjobs are not logically related. Additionally, when using one requirements.txt file for all the scripts, it's hard to figure out what the dependencies are for a particular script, and they all have to use the same versions of packages.

    We could separate all of the scripts out into their own repositories, but this creates 20 different repositories that need to be remembered and dealt with. Most of these scripts are not very large, and that solution seems to be overkill.

    A related question is: do we use one big crontab file for all cronjobs, or a separate file for each? If each has its own, how does one crontab's installation avoid overwriting the other 19? This also seems like a pain, as there would then be 20 different cron files to keep track of.

    In short, our main question and issue is: do we keep them all closely bundled as one repository, or do we separate them out into their own repositories with their own requirements.txt and fabfile.py? We feel like we're probably overlooking some really simple solution. Is there an easier way to deal with this issue?

    Read the article

  • Usage of repository between EF model and code consumer

    - by jim
    I have binary data in my database that I'll have to convert to bitmap at some point. I was thinking whether or not it's appropriate to use a repository and do it there. My consumer, which is a presentation layer, will use this repository. For example:

    // This is a class I created for modeling the item as is.
    public class RealItem
    {
        public string Name { get; set; }
        public Bitmap Image { get; set; }
    }

    public abstract class BaseRepository
    {
        // Using Unity (http://unity.codeplex.com) to inject the dependency on the entity context.
        [Dependency]
        public Context Context { get; set; }
    }

    public class ItemRepository : BaseRepository
    {
        public List<RealItem> Select()
        {
            IEnumerable<Items> items = from item in Context.Items
                                       select item;
            var lst = new List<RealItem>();
            foreach (var itm in items)
            {
                // itm.Image holds the raw bytes from the database; the stream
                // must stay open for the lifetime of the Bitmap created from it.
                var stream = new MemoryStream(itm.Image);
                var image = (Bitmap)Image.FromStream(stream);
                lst.Add(new RealItem { Name = itm.Name, Image = image });
            }
            return lst;
        }
    }

    Is this a correct way to use the repository pattern? I'm learning this pattern, and I've seen a lot of examples online that are using a repository, but when I looked at their source code... for example:

    public IQueryable<object> Select()
    {
        return from q in base.Context select q;
    }

    As you can see, no behavior is added to the system by their approach, so I was confused: maybe the repository is something else and I got it all wrong. In the end there should be extra benefits of using them, right?
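
    As a side note, here is a hypothetical consumer of the ItemRepository above (the caller code is assumed, not from the question). It highlights one consequence of mapping to Bitmap inside the repository: the consumer becomes responsible for disposing the images.

    // Hypothetical consumer of the repository above (names assumed).
    var repository = new ItemRepository();
    List<RealItem> items = repository.Select();
    foreach (var item in items)
    {
        Console.WriteLine("{0}: {1}x{2}", item.Name, item.Image.Width, item.Image.Height);
        item.Image.Dispose(); // Bitmap is IDisposable; the caller owns it from here
    }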

    Read the article

  • Flickr, other account types not appearing in online-accounts

    - by Fen
    Using Shotwell, I discovered that to publish to Flickr I need to set up an online account. But the online-accounts system settings only has support for Google, Facebook, Windows Live, Microsoft Exchange and Enterprise Login (Kerberos). How do I add account types? These appear to be properly installed (dpkg-reconfigure returns silently):

    gnome-control-center-signon is already the newest version.
    account-plugin-yahoo is already the newest version.
    account-plugin-flickr is already the newest version.

    Here's the config file (I think):

    > cat /usr/share/applications/gnome-online-accounts-panel.desktop
    [Desktop Entry]
    Name=Online Accounts
    Comment=Manage online accounts
    Exec=gnome-control-center online-accounts
    Icon=goa-panel
    Terminal=false
    Type=Application
    StartupNotify=true
    Categories=GNOME;GTK;Settings;DesktopSettings;X-GNOME-Settings-Panel;X-GNOME-PersonalSettings;
    OnlyShowIn=GNOME;XFCE
    X-GNOME-Bugzilla-Bugzilla=GNOME
    X-GNOME-Bugzilla-Product=gnome-control-center
    X-GNOME-Bugzilla-Component=Online Accounts
    X-GNOME-Bugzilla-Version=3.4.2
    X-GNOME-Settings-Panel=online-accounts
    # Translators: those are keywords for the online-accounts control-center panel
    Keywords=Google;Facebook;Flickr;Twitter;Yahoo;Web;Online;Chat;Calendar;Mail;Contact;
    X-Ubuntu-Gettext-Domain=gnome-control-center-2.0

    History: Started out with Ubuntu (64-bit), then in 12.04 installed xubuntu-desktop and have been using that. Upgraded to 12.10.

    Read the article

  • Can't add repository due to 'missing' fingerprint

    - by cubsink
    I am trying to install nginx with PHP, but when I follow a guide, like this one: http://www.justincarmony.com/blog/2011/10/24/setting-up-nginx-php-fpm-on-ubuntu-10-04/ I am always told to add that repository (sudo add-apt-repository ppa:brianmercer/php), but I can't. I just get:

    Error: can't find signing_key_fingerprint at https://launchpad.net/api/1.0/~nginx/+archive/php5

    When I go to that website I find that there is a fingerprint specified, but I still get that error message. Is there any way to specify it myself? And lastly, how can I fix this so I can continue my installation towards a working nginx environment with PHP? Thanks for your advice and better wisdom.

    Read the article

  • add-apt-repository not working on Ubuntu GNOME 12.10

    - by nickcannariato
    When I try to add a PPA using the command:

    sudo add-apt-repository [insert ppa]

    the output I get is:

    Error in sitecustomize; set PYTHONVERBOSE for traceback:
    EOFError: EOF read where not expected
    Traceback (most recent call last):
      File "/usr/bin/add-apt-repository", line 3, in <module>
        from __future__ import print_function
    EOFError: EOF read where not expected

    This is the desktop version. It's a clean install and I didn't get any log errors on install. I haven't added or removed any Python versions. Can someone set me straight on how to fix this?

    Read the article

  • C# repository pattern: One repository per subclass?

    - by Alex
    I am wondering if you would create a repository for each subclass of a domain model. There are two classes, for example:

    public class Person
    {
        public virtual String GivenName { set; get; }
        public virtual String FamilyName { set; get; }
        public virtual String EMailAdress { set; get; }
    }

    public class Customer : Person
    {
        public virtual DateTime RegistrationDate { get; set; }
        public virtual String Password { get; set; }
    }

    Would you create both a PersonRepository and a CustomerRepository, or just the PersonRepository, which would also be able to execute Customer-related queries?
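
    One middle ground, sketched below as illustrative code rather than a prescription, is a generic base repository for the shared queries plus a CustomerRepository subclass that adds only the customer-specific ones. The in-memory Session list is a stand-in for whatever ORM session or context the real code would use; the question's domain classes are repeated for self-containment.

    using System;
    using System.Collections.Generic;
    using System.Linq;

    public class Person
    {
        public virtual String GivenName { set; get; }
        public virtual String FamilyName { set; get; }
        public virtual String EMailAdress { set; get; }
    }

    public class Customer : Person
    {
        public virtual DateTime RegistrationDate { get; set; }
        public virtual String Password { get; set; }
    }

    // Shared CRUD and queries live once, in the generic base.
    public class Repository<T> where T : Person
    {
        // Stand-in for an ORM session or DbSet.
        protected readonly List<T> Session = new List<T>();

        public void Add(T entity) { Session.Add(entity); }

        public IEnumerable<T> FindByFamilyName(string familyName)
        {
            return Session.Where(p => p.FamilyName == familyName);
        }
    }

    // Only the customer-specific queries justify a second repository class.
    public class CustomerRepository : Repository<Customer>
    {
        public IEnumerable<Customer> RegisteredSince(DateTime date)
        {
            return Session.Where(c => c.RegistrationDate >= date);
        }
    }

    With this shape, a plain Repository<Person> covers the Person queries, and you only pay for an extra class where the subclass genuinely has queries of its own.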

    Read the article

  • apt-get doesn't see packages in my trivial repository

    - by lorin
    I've tried to set up a trivial repository with binary .debs for internal use, but apt-get doesn't see the packages. I've done the following:

    On the web server:

    1. Created the binary debs with dpkg-buildpackage
    2. Put all of the binary debs in a web-accessible directory which corresponds to http://www.example.com/packages
    3. Generated a Packages.gz file in the same directory by doing: dpkg-scansources . /dev/null | gzip -9c > Packages.gz

    On the client machine:

    1. Added the following line to my /etc/apt/sources.list file: deb http://www.example.com/packages /
    2. Ran: sudo apt-get update

    The output related to my trivial repository looked like this:

    Ign http://www.example.com Release.gpg
    Ign http://www.example.com/packages/ Translation-en_US
    Ign http://www.example.com Release
    Ign http://www.example.com Packages
    Ign http://www.example.com Packages
    Hit http://www.example.com Packages

    But I can't install the package by name. For example, there's a package called "python-nova" which corresponds to the package python-nova_2011.3-custom~bzr680-0ubuntu1_all.deb. I've tried to do: apt-get install python-nova, but I get the following error:

    $ sudo apt-get install python-nova
    Reading package lists... Done
    Building dependency tree
    Reading state information... Done
    E: Couldn't find package python-nova

    Read the article

  • How To: LIC of India Online Policy Payments And Status Enquiries

    - by Kavitha
    Life Insurance Corporation (LIC) of India is the largest state-owned insurance company in India and also the country's largest investor. The premium amount for insurance policies purchased from LIC is paid by visiting the nearest LIC office or through LIC agents. It's a time-consuming process, and most of us are fed up with standing in long queues at LIC offices to pay the premium amount.

    LIC Online Services Website

    No more worries: there is no need to stand in a long queue or approach an agent to pay your LIC policies. LIC of India has an online payment and renewal facility: http://licindia.in. To pay the policies online we have to register with LIC and log in to the site using the registered username and password. Once you log in, you can enter your profile information and the LIC policies that are purchased in your name (register only the policies that are purchased in your name, otherwise you will land in trouble). Once registered, managing activities like payments, loan eligibility checking, policy maturity, etc. is very easy.

    For online payment of policies you can find the Pay Premium Online tab, which when clicked takes you to a page that lists all the policies that are due. Payments can be made using credit/debit cards and online banking systems. Almost all the Indian banks are covered as part of the online payment system. Other services that are available through the online system of LIC are: View ULIP Policies, Premium Calendar, Calculate Loan Eligibility, Revival Quote, Policy Maturity, Address Change Requests, etc.

    LIC Policy Status Enquiry Through Phone

    LIC also has a helpline/customer care number, 1251. You can call 1251 to know about your policy status, premium due date, loan possibility and possible loan amount, time of maturity, etc.

    Read the article

  • NRF Online Merchandising Workshop: Where Online Retailers Are Focusing for Holiday and Beyond

    - by Rose Spicer-Oracle
    Last month we attended the NRF Online Merchandising Workshop in LA, and it was a great opportunity to catch up with our customers, meet new retailers, and hear some great presentations from VF Corporation, Zazzle, Julep Beauty, Backcountry, eBags and more. The one-on-one conversations with merchants and the keynote presentations carry the same themes across companies of all sizes and across verticals. With only 125 days left (and counting) until Black Friday, these conversations provided some great insight into what's top of mind for retailers during the most stressful time of their year, and a sneak peek into what they will deliver this holiday season. Some of the most popular topics were:

    When to start promoting for holiday: it seems like a funny conversation to have in July, but a number of retailers said they already had their holiday shopping gift guides live on their site, and they were attracting a significant portion of their onsite traffic. When it comes to timing, most retailers were questioning when to begin their holiday promotions, carefully balancing when to release pricing and specials, and knowing that customers are holding out for last-minute deals and price drops. Many retailers noted the frustrations around transparent pricing by Amazon and a few other mega-retailers last year, publishing their "lowest prices of the season" as early as October, ensuring shoppers that those prices were the best they could get all season long. Many retailers felt their hands were forced to drop prices. Others kept their set pricing, with negative customer reaction, causing some to miss their holiday goals. The pressure is on, and most retailers identified November 1 as their target start date for the holiday promotions blitz. Some are even waiting for the big guys to release their "lowest prices of the season" guides and will then follow suit.

    Attribution is tough – and a huge focus: understanding the path to conversion is a tough nut to crack, especially in the new omnichannel world where consumers use multiple touchpoints to make a single purchase, and internal management wants to see hard data. This has led many retailers to invest in attribution, carefully tracking their online marketing efforts to determine what gets "credit" for the sale, instead of giving credit to the "last click." Retailers noted that it is very difficult to determine the numbers when online and offline worlds collide, like when a shopper uses digital channels for research and then makes a purchase in a store. As one of the presenters from The North Face mentioned in her keynote, a key to enabling better customer service and satisfaction when it comes to converged online and offline sales is training the in-store staff, and creating a culture where it eventually "doesn't matter what group gets the credit" if they all add to the sale. No doubt, the area of attribution will be a big area of retail investment in the coming years.
    How to plan for the converged world: planning to ensure inventory gets where it needs to be was another concern. In conversations with retailers, we advised them to analyze customer patterns: where shoppers purchase items, where the items were sourced from, and even where items are returned. This analysis is very valuable in determining inventory plans. From there, retailers can more accurately plan and allocate inventory to support both online and offline customer behavior. As we head into the holiday season, the need for accurate enterprise-wide inventory visibility, and providing that information to associates, is even more critical to the brand-wide customer experience.

    Improving the search / navigation / usability of the site(s): aside from some of the big ideas and standard holiday pricing pressure, most conversations we had centered around continuing to improve the basics of the site. Reinvesting in search and navigation came up time and time again (FitForCommerce blogged about what a big topic it was at the event as well). Obviously, getting shoppers on their path quickly and allowing them to find what they need fast is critical, but it was definitely interesting to hear just how much effort is still going into honing the search and navigation experience. Adding new elements to search and navigation like typeahead, inventive navigation refinements, and new navigation categories like gift guides, specialized boutiques and flash sales were top of mind, in addition to searchandising and making search-driven product recommendations. (Oracle can help!)

    Reducing cart abandonment: always a hot topic that is top of mind for every online retailer. Getting shoppers to the cart is often less than half the battle; getting them to click "buy" and complete the transaction is much more difficult. While retailers carefully study the checkout process and where shoppers tend to bounce, they know that how they design their checkout page is critical. We're all online shoppers in our personal lives, and we know how frustrating it can be when total prices are not transparent (i.e. shipping, processing, and taxes are not included until the very last possible screen before clicking that buy button). Online retailers are struggling with where in the checkout process to surface the total price to reduce cart abandonment, while not showing the total figure so early in the process that it keeps shoppers from getting to checkout altogether. Recent research shows that providing total pricing prior to the checkout process dramatically reduces cart abandonment, as it serves as a filter for those shopping within a specific price band. Much of the cart abandonment discussion leads us to...

    The free shipping / free returns question: it's no secret that because of Amazon and programs like Prime, consumers expect free shipping, much to the chagrin of the smaller retailer. The reality is that if you're not a mega-retailer, shipping is an expensive part of doing business that doesn't allow most retailers to keep their prices low and offer free shipping. This has many retailers venturing out on the "free returns" path, especially in apparel. A number of retailers we spoke with are testing a flat-rate shipping fee with free returns to see if they can crack the price threshold where shoppers are willing to pay for shipping with an added service. But free shipping remains king.

    Social ads and retargeting: they are working, but do they turn off consumers? That's the big question.
    Every retailer we spoke with during a roundtable on the topic said that social ads and retargeting (where that pair of boots you've been eyeing on a site magically follows you around the Internet) work and are meeting campaign goals. The larger question many retailers are asking is whether this type of tactic is turning off a large number of shoppers, even if these campaigns are meeting their early goals. Retailers also mentioned that Facebook ads are working very well for them, especially when it comes to new customer acquisition, serving as a complementary channel to SEO for engaging new customers.

    While there are always new things to experiment with in retail, standard challenges are top of mind as retailers scramble to get ready for holiday. It will undoubtedly be another record-breaking online shopping season, but as retailers get more and more advanced with each Black Friday, expect some exciting things. This excitement needs to be backed by sound solutions and optimized operations. Then again, consumers are expecting more than ever, so I don't doubt that retailers are already thinking about the possibilities of holiday 2015... and beyond.

    Customers who read this article also found value in the following stories:

    Personalization for Retail: http://blogs.oracle.com/retail/entry/personalization_for_retail
    Shop Direct User Experience Focus Drives Sales: https://blogs.oracle.com/retail/entry/shop_direct_user_experience_focus
    Making Waves: Australian Online Retailer SurfStitch: https://blogs.oracle.com/oracleretail/entry/surf_stitch
    What's New in Oracle Commerce v11.1 for Retail
    What the Content+Commerce Equation is Missing

    Read the article

  • Update Manager not working: Failed to download repository information

    - by user51564
    When I try to update, I get this error:

    Failed to download your repository information
    Check your internet connection.

    W: GPG error: http://ppa.launchpad.net oneiric Release: The following signatures couldn't be verified because the public key is not available: NO_PUBKEY 4874D3686E80C6B7
    W: Failed to fetch http://ppa.launchpad.net/pmcenery/ppa/ubuntu/dists/oneiric/main/source/Sources  404 Not Found
    W: Failed to fetch http://ppa.launchpad.net/pmcenery/ppa/ubuntu/dists/oneiric/main/binary-i386/Packages  404 Not Found
    E: Some index files failed to download. They have been ignored, or old ones used instead.

    Read the article

  • Update manager is not working: Failed to download repository information

    - by harry
    My update manager is not working and shows this message:

    Failed to download repository information
    Check your Internet connection.

    W: Failed to fetch http://ppa.launchpad.net/pmcenery/ppa/ubuntu/dists/precise/main/source/Sources  404 Not Found
    W: Failed to fetch http://ppa.launchpad.net/pmcenery/ppa/ubuntu/dists/precise/main/binary-i386/Packages  404 Not Found
    E: Some index files failed to download. They have been ignored, or old ones used instead.

    Please help. Thank you for your time.

    Read the article

  • Ops Center Solaris 11 IPS Repository Management: Using ISO Images

    - by S Stelting
    Please join us for a live WebEx presentation of this topic on Tuesday, November 20th at 9am MDT. Details for the call are provided below:

    https://oracleconferencing.webex.com/oracleconferencing/j.php?ED=209834017&UID=1512096072&PW=NYTVlZTYxMzdm&RT=MiMxMQ%3D%3D
    Meeting password: oracle123
    Call-in toll-free number: 1-866-682-4770
    International numbers: http://www.intercall.com/oracle/access_numbers.htm
    Conference Code: 762 9343 #
    Security Code: 7777 #

    With Enterprise Manager Ops Center 12c, you can provision, patch, monitor and manage Oracle Solaris 11 instances. To do this, Ops Center creates and maintains a Solaris 11 Image Packaging System (IPS) repository on the Enterprise Controller. During the Enterprise Controller configuration, you can load repository content directly from Oracle's Support Web site and subsequently synchronize the repository as new content becomes available. Of course, you can also use Solaris 11 ISO images to create and update your Ops Center repository. There are a few excellent reasons for doing this:

    - You're running Ops Center in disconnected mode, and don't have Internet access on your Enterprise Controller
    - You'd rather avoid the bandwidth associated with live synchronization of a Solaris 11 package repository

    This demo will show you how to use Solaris 11 ISO images to set up and update your Ops Center repository.

    Prerequisites

    This tip assumes that you've already installed the Enterprise Controller on a Solaris 11 OS instance and that you're ready for post-install configuration. In addition, there are specific Ops Center and OS version requirements depending on which version of Solaris 11 you plan to install. You can get full details about the requirements in the Release Notes for Ops Center 12c update 2. Additional information is available in the Ops Center update 2 Readme document.

    Part 1: Using a Solaris 11 ISO Image to Create an Ops Center Repository

    Step 1 – Download the Solaris 11 Repository Image

    The Oracle Web site provides a number of download links for official Solaris 11 images. Among those links is a two-part downloadable repository image, which provides repository content for the Solaris 11 SPARC and X86 architectures. In this case, I used the Solaris 11 11/11 image. First, navigate to the Oracle Web site and accept the OTN License agreement:

    http://www.oracle.com/technetwork/server-storage/solaris11/downloads/index.html

    Next, download both parts of the Solaris 11 repository image. I recommend using the Solaris 11 11/11 image, and have provided the URLs here:

    http://download.oracle.com/otn/solaris/11/sol-11-1111-repo-full.iso-a
    http://download.oracle.com/otn/solaris/11/sol-11-1111-repo-full.iso-b

    Finally, use the cat command to generate an ISO image you can use to create your repository:

    # cat sol-11-1111-repo-full.iso-a sol-11-1111-repo-full.iso-b > sol-11-1111-repo-full.iso

    The process is very similar if you plan to set up a Solaris 11.1 release in Ops Center. In that case, navigate to the Solaris 11 download page, accept the license agreement and download both parts of the Solaris 11.1 repository image. Use the cat command to create a single ISO image for Solaris 11.1.

    Step 2 – Mount the Solaris 11 ISO Image in your Local Filesystem

    Once you have created the Solaris 11 ISO file, use the mount command to attach it to your local filesystem.
    After the image has been mounted, you can browse the repository from the ./repo subdirectory, and use the pkgrepo command to verify that Solaris 11 recognizes the content.

    Step 3 – Use the Image to Create your Ops Center Repository

    When you have confirmed the repository is available, you can use the image to create the Enterprise Controller repository. The operation will be slightly different depending on whether you configure Ops Center for Connected or Disconnected Mode operation. For connected mode operation, specify the mounted ./repo directory in step 4.1 of the configuration wizard, replacing the default Web-based URL. Since you're synchronizing from an OS repository image, you don't need to specify a key or certificate for the operation. For disconnected mode configuration, specify the Solaris 11 directory along with the path to the disconnected mode bundle downloaded by running the Ops Center harvester script.

    Ops Center will run a job to import package content from the mounted ISO image. A synchronization job can take several hours to run – in my case, the job ran for 3 hours, 22 minutes on a SunFire X4200 M2 server. During the job, Ops Center performs three important tasks:

    1. Synchronizes all content from the image and refreshes the repository
    2. Updates the IPS publisher information
    3. Creates OS provisioning profiles and policies based on the content

    When the job is complete, you can unmount the ISO image from your Enterprise Controller. At that time, you can view the repository contents in your Ops Center Solaris 11 library. For the Solaris 11 11/11 release, you should see 8,668 packages and patches in the contents. You should also see default deployment plans for Solaris 11 provisioning. As part of the repository import, Ops Center generates plans and profiles for desktop, small and large servers for the SPARC and X86 architectures.

    Part 2: Using a Solaris 11 SRU to Update an Ops Center Repository

    It's possible to use the same approach to upgrade your Ops Center repository to a Solaris 11 Support Repository Update, or SRU. Each SRU provides packages and updates to Solaris 11 – for example, SRU 8.5 provided the packages for Oracle VM Server for SPARC 2.2. SRUs are available for download as ISO images from My Oracle Support, under document ID 1372094.1. The document provides download links for all SRUs which have been released by Oracle for Solaris 11. SRUs are cumulative, so later versions include the packages from earlier SRUs.

    After downloading an ISO image for an SRU, you can mount it to your local filesystem using a mount command similar to the one shown for Solaris 11 11/11. When the ISO image is mounted to the file system, you can perform the Add Content action from the Solaris 11 Library to synchronize packages and patches from the mounted image. I used the same mount point, so the repository URL was file://mnt/repo once again.

    After the synchronization of an SRU is complete, you can verify its content in the Solaris 11 library using the search function. The version pattern is 0.175.0.#, where the # is the same value as the SRU. In this example, I upgraded to SRU 1. The update job ran in just under 8 minutes, and a quick search shows that 22 software components were added to the repository.

    It's also possible to search for "Support Repository Update" to confirm the SRU was successfully added to the repository. Details on any of the update content are available by clicking the "View Details" button under the Packages/Patches entry.

    Read the article

  • Mock Repository vs. Real Repository w/Mocked Data

    - by n8wrl
    I must be doing something fundamentally wrong. I am implementing my repositories and then testing them with mocked data. All is well. Now I want to test my domain objects, so I point them at mock repositories. But I'm finding that I have to re-implement logic from the 'real' repositories in the mocks, or create 'helper classes' that encapsulate the logic and interact with the repositories (real or mock), and then I have to test those too. So what am I missing: why implement and test mock repositories when I could use the real ones with mocked data?

    EDIT: To clarify, by 'mocked data' I do not hit the actual database. I have a 'DB mock layer' I can insert under the real repositories that returns known data.
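
    To make the trade-off concrete, here is a minimal sketch of the approach the question leans toward (all type names are invented for illustration): the real repository runs against a faked data-access seam that returns known data, so none of the repository's logic has to be re-implemented in a mock.

    using System;
    using System.Collections.Generic;
    using System.Linq;

    public class Order
    {
        public int Id { get; set; }
        public decimal Total { get; set; }
    }

    // The seam: the repository depends on this narrow data-access
    // abstraction rather than on a concrete database.
    public interface IDbGateway
    {
        IEnumerable<Order> QueryOrders();
    }

    // Real repository: the logic worth testing lives here, once.
    public class OrderRepository
    {
        private readonly IDbGateway _db;
        public OrderRepository(IDbGateway db) { _db = db; }

        public IEnumerable<Order> LargeOrders(decimal threshold)
        {
            return _db.QueryOrders().Where(o => o.Total >= threshold);
        }
    }

    // "DB mock layer": returns known data, no database involved.
    public class FakeDbGateway : IDbGateway
    {
        private readonly List<Order> _rows;
        public FakeDbGateway(IEnumerable<Order> rows) { _rows = rows.ToList(); }
        public IEnumerable<Order> QueryOrders() { return _rows; }
    }

    public static class Program
    {
        public static void Main()
        {
            var repository = new OrderRepository(new FakeDbGateway(new[]
            {
                new Order { Id = 1, Total = 10m },
                new Order { Id = 2, Total = 500m }
            }));
            // The real filtering logic runs against the fake data.
            Console.WriteLine(repository.LargeOrders(100m).Single().Id); // prints 2
        }
    }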

    Read the article

  • AIA Artifacts in the Oracle Enterprise Repository

    - by Hans Viehmann
    The Oracle Enterprise Repository (OER) is the central place for managing SOA artifacts of every kind, with the goal of supporting those artifacts across their entire lifecycle. It is an essential foundation for their reuse, for determining dependencies, and for assessing the value of these artifacts, which in turn matters for the overall benefit of the SOA implementation. AIA 11g supports the current version of OER and complements it with the Project Lifecycle Workbench, where the functional specification, the decomposition of processes, and, for example, the generation of the deployment plan take place. To make the Foundation Pack 11g artifacts available, there is now a corresponding AIA Solution Pack for OER with which the relevant structures, as well as the components of Foundation Pack 11g (EBOs, EBMs, EBSs, and so on), can be imported directly, independently of an AIA installation. The pack is also available on support.oracle.com and can be downloaded here.

    Read the article

  • Solution with multiple projects and a single (GitHub) issue tracker and repository

    - by Luiz Damim
    I have a Visual Studio solution with multiple projects:

    Acme.Core
    Acme.Core.Tests
    Acme.UI.MvcSite1
    Acme.UI.MvcSite2
    Acme.UI.WinformsApp1
    Acme.UI.WinformsApp2
    ...

    The entire solution is checked in to a single private GitHub repo. Acme.Core contains our business logic, and all UI projects are deployables. UI projects have different requirements and features, but some features are implemented in more than one project. All issues are opened in a single issue tracker and classified using labels ([MvcSite1], [WinformsApp1], etc.), but I'm thinking it's starting to get messy. Is it OK to use a single repository and issue tracker to track multiple projects in one solution?

    Read the article
