Search Results

Search found 11078 results on 444 pages for 'virtual inheritance'.

Page 11/444

  • Virtual PC on Windows 7 - Hardware-assisted virtualization is disabled

    - by DLux
    I am running a Lenovo ThinkPad T61 with an Intel Core 2 Duo T7300 processor. When I run Virtual PC in Windows 7, I get the following error: Unable to start Windows Virtual PC because hardware-assisted virtualization is disabled. The Hardware-Assisted Virtualization Detection Tool from Microsoft says: Hardware-assisted virtualization is not enabled on this computer. Now, in the BIOS I do have virtualization enabled, and according to Intel this processor supports Intel VT. What am I missing here?

    Read the article

  • virtual appliance resources

    - by user11457
    I'm fairly new to appliances but have become a big fan of virtualization. Are there any other resources where I can find and download virtual appliances outside of VMware's website? Or do I just need to scour the web looking for them? I'm sure there is a central site that lists all the virtual appliances available; if there isn't, I'm going to write one.

    Read the article

  • How to activate Virtual Desktop on Fortigate 100A?

    - by Deniz
    We recently updated the firmware on our Fortigate 100A box, and after the upgrade we tried to use the "Virtual Desktop" feature. (This isn't a new firmware feature.) We can't find a way to activate or use it. Does anyone have any experience with the "Virtual Desktop" feature of Fortigate devices?

    Read the article

  • IIS6 Virtual Directory 500 Error on Remote Share

    - by David Boike
    We have our servers at the server farm in a domain. Let's call it LIVE. Our developer computers live in a completely separate corporate domain, miles and miles away. Let's call it CORP. We have a large central storage unit (Unix) that houses images and other media needed by many webservers in the server farm. The IIS application pools run as (let's say) LIVE\MediaUser and use those credentials to connect to a central storage share as a virtual directory, retrieve the images, and serve them as if they were local on each server. The problem is in development, on my development machine. I log in as CORP\MyName. My IIS 6 application pool runs as Network Service. I can't run it as a user from the LIVE domain because my machine isn't (and cannot be) joined to that domain. I try to create a virtual directory, point it to the same network directory, click Connect As, uncheck the "Always use the authenticated user's credentials when validating access to the network directory" checkbox so that I can enter the login info, enter the credentials for LIVE\MediaUser, click OK, verify the password, etc. This doesn't work. I get "HTTP Error 500 - Internal server error" from IIS. The IIS log file reports sc-status = 500, sc-substatus = 16, and sc-win32-status = 1326. The documentation says this means "UNC authorization credentials are incorrect" and the Win32 status means "Logon failure: unknown user name or bad password." This would be all well and good if it were anywhere close to accurate. I double- and triple-checked it and tried multiple known good logins. The IIS manager allows me to view the file tree in its window; it's only the browser that kicks me out. I even tried going to the virtual directory's Directory Security tab and, under Authentication and Access Control, using the same LIVE domain username for the anonymous access credential. No luck. I'm not trying to run any ASP, ASP.NET, or other dynamic anything out of the virtual directory. I just want IIS to be able to load static images, CSS, and JS files. If anyone has some bright ideas I would be most appreciative!

    Read the article

  • Can GPU capabilities impact virtual machine performance?

    - by Dave White
    While this may not seem like a programming question directly, it impacts my development activities and so it seems like it belongs here. It seems that more and more developers are turning to virtual environments for development activities on their computers, SharePoint development being a prime example. Also, as a trainer, I have virtual training environments for all of the classes that I teach. I recently purchased a new Dell E6510 to travel around with. It has the i7 620M (dual-core, HyperThreaded CPU running at 2.66GHz) and 8 GB of memory. Reading the spec sheet, it sounded like it would be a great laptop to carry around and run virtual machines on. After getting the laptop, though, I've been pretty disappointed with the user experience of developing in a virtual machine. Giving the virtual machine 4 GB of memory, it was slow; I could type complete sentences and watch the VM catch up. My company has training laptops that we provide for our classes. They are Dell Precision M6400s with an Intel Core 2 Duo P8700 running at 2.54GHz and 8 GB of memory, and the experience on these laptops is night and day compared to the E6510. They are crisp and you are barely aware that you are running in a virtual environment. Since the E6510 should be faster in all categories than the M6400, I couldn't understand why the new laptop was slower, so I did a component-by-component comparison, and the only place where the E6510 is less performant than the M6400 is the graphics department. The M6400 is running an NVIDIA FX 2700M GPU and the E6510 is running an NVIDIA 3100M GPU. Benchmarks of the two GPUs suggest that the FX 2700M is twice as fast as the 3100M: http://www.notebookcheck.net/Mobile-Graphics-Cards-Benchmark-List.844.0.html (3100M = 111th (E6510), FX 2700M = 47th (Precision M6400), Radeon HD 5870 = 8th (Alienware)). The host OS is Windows 7 64-bit, as is the guest OS, running in VirtualBox 3.1.8 with Guest Additions installed on the guest. The IDE being used in the virtual environment is VS 2010 Premium. So after that long setup, my question is: is the GPU significantly impacting the virtual machine's performance, or are there other factors that I'm not looking at that I can use to boost the VM's performance? Do we now have to consider GPU performance when purchasing laptops where we expect to use virtualized development environments? Thanks in advance. Cheers, Dave

    Read the article

  • Virtual Function Implementation

    - by Gokul
    Hi, I keep hearing this statement: switch..case is evil for code maintenance, but it provides better performance (since the compiler can inline things, etc.). Virtual functions are very good for code maintenance, but they incur a performance penalty of two pointer indirections. Say I have a base class with 2 subclasses (X and Y) and one virtual function, so there will be two virtual tables. The object has a pointer, based on which it will choose a virtual table. So for the compiler, it is more like switch( object's function ptr ) { case 0x....: X->call(); break; case 0x....: Y->call(); }; So why should a virtual function cost more if it can be implemented this way, since the compiler can do the same inlining and other optimizations here? Or explain to me why it was decided not to implement virtual function execution in this way. Thanks, Gokul.
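
    A minimal sketch, not taken from the question, contrasting the two dispatch styles being compared; class and function names are illustrative. The virtual call is resolved through the object's vtable pointer at run time, while the hand-rolled switch on a type tag exposes every branch to the compiler, which is what makes inlining straightforward in the second case:

      #include <iostream>

      // Virtual dispatch: the target is found through the object's vtable
      // pointer, so the exact function is not known until run time.
      struct Base {
          virtual ~Base() = default;
          virtual void call() const { std::cout << "Base\n"; }
      };
      struct X : Base { void call() const override { std::cout << "X\n"; } };
      struct Y : Base { void call() const override { std::cout << "Y\n"; } };

      // Hand-rolled alternative: a type tag plus a switch. Every branch is
      // visible to the compiler, so each call can be inlined.
      enum class Tag { X, Y };
      void dispatch(Tag t) {
          switch (t) {
              case Tag::X: X{}.call(); break;
              case Tag::Y: Y{}.call(); break;
          }
      }

      int main() {
          Base* obj = new X;
          obj->call();       // resolved through the vtable at run time
          dispatch(Tag::Y);  // resolved branch-by-branch at compile time
          delete obj;
          return 0;
      }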

    Read the article

  • Two dimensional virtual desktop space in Ubuntu

    - by Herms
    Is there any way to create a 2-dimensional virtual desktop space in Ubuntu? The only control I'm seeing is the number of virtual desktops, but they seem to go only in a line. I'm used to having a 2-dimensional space (so I can go up/down/left/right instead of just left/right), and I'd really like to have that in Ubuntu as well.

    Read the article

  • Resource Pool sharing in Hyper-V Virtual Machines

    - by user67905
    I understand that we can install Hyper-V on one server and run a number of virtual machines on it, up to the limit of that server's resources. I want to know if it is possible to install Hyper-V on two or more servers so that the virtual machines can use the underlying resource pool of both servers. And is the same possible for n servers, instead of just 2?

    Read the article

  • In a Virtual Machine, is the Virtual Processor using RAM or part of the Processor?

    - by Jason H
    I am running a 15" MacBook Pro (2.66GHz) with 4GB of RAM. I am considering downgrading to a 13" MacBook Pro (2.4GHz) with 8GB of RAM. Most of what I do at work is through Windows, and I need to run it virtually. So my real question is: when running a virtual machine, will the virtual processor be utilizing RAM or part of the host's processor? My assumption is that it will utilize the allocated RAM, but I have seen zero documentation to support that.

    Read the article

  • Does Qt support pure virtual slots?

    - by ereOn
    Hi, My GUI project in Qt has a lot of "configuration page" classes which all inherit directly from QWidget. Recently, I realized that all these classes share 2 common slots (loadSettings() and saveSettings()). Regarding this, I have two questions: Does it make sense to write an intermediate abstract base class (let's name it BaseConfigurationPage) with these two slots as pure virtual methods? (Every possible configuration page will always have these two methods, so I would say "yes".) Before I make the heavy change in my code (if I have to): does Qt support pure virtual slots? Is there anything I should be aware of? Here is a code example describing everything: class BaseConfigurationPage : public QWidget { // Some constructor and other methods, irrelevant here. public slots: virtual void loadSettings() = 0; virtual void saveSettings() = 0; }; class GeneralConfigurationPage : public BaseConfigurationPage { // Some constructor and other methods, irrelevant here. public slots: void loadSettings(); void saveSettings(); };
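
    A minimal sketch (names reuse the question's, the wire() helper is hypothetical) of how such a base class is typically used: the connection is declared against the abstract BaseConfigurationPage, and virtual dispatch routes the signal to the concrete page's override when it fires. As with any QObject subclass, the classes need the Q_OBJECT macro and moc processing:

      #include <QPushButton>
      #include <QWidget>

      class BaseConfigurationPage : public QWidget {
          Q_OBJECT
      public slots:
          virtual void loadSettings() = 0;  // pure virtual slots: to moc these
          virtual void saveSettings() = 0;  // are ordinary member functions
      };

      class GeneralConfigurationPage : public BaseConfigurationPage {
          Q_OBJECT
      public slots:
          void loadSettings() override {}
          void saveSettings() override {}
      };

      // Wire a button to whichever concrete page is passed in; the slot that
      // actually runs is chosen by virtual dispatch at run time.
      void wire(BaseConfigurationPage* page, QPushButton* saveButton) {
          QObject::connect(saveButton, SIGNAL(clicked()),
                           page, SLOT(saveSettings()));
      }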

    Read the article

  • Virtual session on Windows XP

    - by dotnet-practitioner
    What is the easiest way to install, set up, and run a virtual session on my fresh install of Windows XP? I want to be able to browse and install new software in a virtual session instead of on the machine itself. What is available out there? What kind of software would it take, and are there any free solutions out there? The easiest solution would be very helpful for me.

    Read the article

  • Default class for SQLAlchemy single table inheritance

    - by eclaird
    I've set up a single table inheritance, but I need a "default" class to use when an unknown polymorphic identity is encountered. The database is not in my control and so the data can be pretty much anything. A working example setup: import sqlalchemy as sa from sqlalchemy import orm engine = sa.create_engine('sqlite://') metadata = sa.MetaData(bind=engine) table = sa.Table('example_types', metadata, sa.Column('id', sa.Integer, primary_key=True), sa.Column('type', sa.Integer), ) metadata.create_all() class BaseType(object): pass class TypeA(BaseType): pass class TypeB(BaseType): pass base_mapper = orm.mapper(BaseType, table, polymorphic_on=table.c.type, polymorphic_identity=None, ) orm.mapper(TypeA, inherits=base_mapper, polymorphic_identity='A', ) orm.mapper(TypeB, inherits=base_mapper, polymorphic_identity='B', ) Session = orm.sessionmaker(autocommit=False, autoflush=False) session = Session() Now, if I insert a new unmapped identity... engine.execute('INSERT INTO EXAMPLE_TYPES (TYPE) VALUES (\'C\')') session.query(BaseType).first() ...things break. Traceback (most recent call last): File "<stdin>", line 1, in <module> File ".../SQLAlchemy-0.6.5-py2.6.egg/sqlalchemy/orm/query.py", line 1619, in first ret = list(self[0:1]) File ".../SQLAlchemy-0.6.5-py2.6.egg/sqlalchemy/orm/query.py", line 1528, in __getitem__ return list(res) File ".../SQLAlchemy-0.6.5-py2.6.egg/sqlalchemy/orm/query.py", line 1797, in instances rows = [process[0](row, None) for row in fetch] File ".../SQLAlchemy-0.6.5-py2.6.egg/sqlalchemy/orm/mapper.py", line 2179, in _instance _instance = polymorphic_instances[discriminator] File ".../SQLAlchemy-0.6.5-py2.6.egg/sqlalchemy/util.py", line 83, in __missing__ self[key] = val = self.creator(key) File ".../SQLAlchemy-0.6.5-py2.6.egg/sqlalchemy/orm/mapper.py", line 2341, in configure_subclass_mapper discriminator) AssertionError: No such polymorphic_identity u'C' is defined What I expected: >>> result = session.query(BaseType).first() >>> result <BaseType object at 0x1c8db70> >>> result.type u'C' I think this used to work with some older version of SQLAlchemy, but I haven't been keeping up with the development lately. Any pointers on how to accomplish this?

    Read the article

  • Inheritance mapping with Fluent NHibernate

    - by Berryl
    Below is an example of how I currently use automapping overrides to set up my db representation of inheritance. It gets the job done functionality-wise, BUT by using some internal default values. For example, the discriminator column name winds up being the literal value 'discriminator' instead of "ActivityType", and the discriminator values are the fully qualified type of each class, instead of "ACCOUNT" and "PROJECT". I am guessing that this is a bug that doesn't get much attention now that conventions are preferred, and that the convention approach works correctly. I am looking for a sample of usage. Cheers, Berryl public class ActivityBaseMap : IAutoMappingOverride<ActivityBase> { public void Override(AutoMapping<ActivityBase> mapping) { ... mapping.DiscriminateSubClassesOnColumn("ActivityType"); } } public class AccountingActivityMap : SubclassMap<AccountingActivity> { public AccountingActivityMap() { ... DiscriminatorValue("ACCOUNT"); } } public class ProjectActivityMap : SubclassMap<ProjectActivity> { public ProjectActivityMap() { ... DiscriminatorValue("PROJECT"); } }

    Read the article

  • Objective-C Simple Inheritance and OO Principles

    - by bleeckerj
    I have a subclass SubClass that inherits from base class BaseClass. BaseClass has an initializer, like so: -(id)init { self = [super init]; if(self) { [self commonInit]; } return self; } -(void)commonInit { self.goodStuff = [[NSMutableArray alloc]init]; } SubClass does its initializer, like so: -(id)init { self = [super init]; if(self) { [self commonInit]; } return self; } -(void)commonInit { self.extraGoodStuff = [[NSMutableArray alloc]init]; } Now, I've never taken a proper Objective-C course, but I'm a programmer more from the electrical engineering side, so I make do. I've developed server-side applications mostly in Java, though, so I may be seeing the OO world through Java principles. When SubClass is initialized, it calls the BaseClass init, and my expectation would be (because inheritance to me implies that characteristics of a BaseClass pass through to SubClass) that the commonInit method in BaseClass would be called during BaseClass init. It is not. I can sort of understand, maybe-possibly-stretching-my-imagination, why it wouldn't be. But then, why wouldn't it be, based on the principles of OOP? What does "self" represent if not the instance of the class of the running code? Okay, so I'm not going to argue that what a well-developed edition of Objective-C is doing is wrong. So then, what is the pattern I should be using in this case? I want SubClass to have two main bits: the goodStuff that BaseClass has as well as the extraGoodStuff that it deserves. Clearly, I've been using the wrong pattern in this type of situation. Am I meant to expose commonInit (which makes me wonder about encapsulation principles: why expose something that, in the Java world at least, would be considered "protected" and something that should only ever be called once for each instance)? I've run into a similar problem in the recent past and tried to muddle through it, but now I'm really wondering if I've got my principles and concepts all straight in my head. Little help, please.

    Read the article

  • JavaScript: prototypal inheritance and the prototype property

    - by JanD
    Hi, I have a simple code fragment in JS working with prototypal inheritance. function object(o) { function F() {} F.prototype = o; return new F(); } // the following code block has an alternate version var mammal = { color: "brown", getColor: function() { return this.color; } } var myCat = object(mammal); myCat.meow = function(){return "meow";} That worked fine, but adding this: mammal.prototype.kindOf = "predator"; does not. ("mammal.prototype is undefined") Since I guessed that the object maybe has no prototype, I rewrote it, replacing the var mammal={... block with: function mammal() { this.color = "brown"; this.getColor = function() { return this.color; } } which gave me a bunch of other errors: "Function.prototype.toString called on incompatible object", and if I try to call _myCat.getColor(): "myCat.getColor is not a function". Now I am totally confused. After reading Crockford and Flanagan I did not find the solution for the errors. So it would be great if somebody knows: why is the prototype undefined in the first example (which is my foremost concern; I thought the prototype was explicitly set in the object() function), and why do I get these strange errors trying to use the mammal function as the prototype object in the object() function? Edit by the creator of the question: These two links helped a lot too: Prototypes_in_JavaScript on the spheredev wiki explains the way the prototype property works relatively simply. What it lacks is some try-out code examples. Some good examples are provided by Morris John's article. I personally find the explanations not as easy as in the first link, but still very good. The most difficult part, even after I actually got it, is really not to confuse the .prototype property with the internal [[Prototype]] of an object.

    Read the article

  • Generics vs inheritance (when no collection classes are involved)

    - by Ram
    This is an extension of this question and might even be a duplicate of some other question (if so, please forgive me). I see from MSDN that generics are usually used with collections: The most common use for generic classes is with collections like linked lists, hash tables, stacks, queues, trees and so on where operations such as adding and removing items from the collection are performed in much the same way regardless of the type of data being stored. The examples I have seen also validate the above statement. Can someone give a valid use of generics in a real-life scenario which does not involve any collections? Pedantically, I was thinking about making an example which does not involve collections: public class Animal<T> { public void Speak() { Console.WriteLine("I am an Animal and my type is " + typeof(T).ToString()); } public void Eat() { //Eat food } } public class Dog { public void WhoAmI() { Console.WriteLine(this.GetType().ToString()); } } and "an Animal of type Dog" will be Animal<Dog> magic = new Animal<Dog>(); It is entirely possible to have Dog inherit from Animal (assuming a non-generic version of Animal): Dog : Animal. Therefore Dog is an Animal. Another example I was thinking of was a BankAccount. It can be BankAccount<Checking> or BankAccount<Savings>. This can very well be Checking : BankAccount and Savings : BankAccount. Are there any best practices to determine if we should go with generics or with inheritance?

    Read the article

  • Objective-C protocol vs inheritance vs extending?

    - by ryanjm.mp
    I have a couple of classes that have nearly identical code. Only a string or two is different between them. What I would like to do is make them "x" from another class that defines those functions and then uses constants or something else to define those strings that are different. I'm not sure if "x" is inheritance or extending or what; that is what I need help with. For example: objectA.m: -(void)helloWorld { NSLog(@"Hello %@", child.name); } objectBob.m: #define name @"Bob" objectJoe.m: #define name @"Joe" (I'm not sure if it's legal to define strings this way, but it gets the point across.) It would be ideal if objectBob.m and objectJoe.m didn't even have to define the methods, just their relationship to objectA.m. Is there any way to do something like this? It is kind of like a protocol, except in reverse: I want the "protocol" to actually define the functions. If all else fails I'll just make objectA.m: -(void)helloWorld:(NSString *)name { NSLog(@"Hello %@", name); } and have the other files call that function (and just #import objectA.m).

    Read the article

  • Inheritance inside a template - public members become invisible?

    - by Juliano
    I'm trying to use inheritance among classes defined inside a class template (inner classes). However, the compiler (GCC) is refusing to give me access to public members in the base class. Example code: template <int D> struct Space { struct Plane { Plane(Space& b); virtual int& at(int y, int z) = 0; Space& space; /* <= this member is public */ }; struct PlaneX: public Plane { /* using Plane::space; */ PlaneX(Space& b, int x); int& at(int y, int z); const int cx; }; int& at(int x, int y, int z); }; template <int D> int& Space<D>::PlaneX::at(int y, int z) { return space.at(cx, y, z); /* <= but it fails here */ }; Space<4> sp4; The compiler says: file.cpp: In member function ‘int& Space::PlaneX::at(int, int)’: file.cpp:21: error: ‘space’ was not declared in this scope If using Plane::space; is added to the definition of class PlaneX, or if the base class member is accessed through the this pointer, or if class Space is changed to a non-template class, then the compiler is fine with it. I don't know if this is either some obscure restriction of C++, or a bug in GCC (GCC versions 4.4.1 and 4.4.3 tested). Does anyone have an idea?
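
    This matches how GCC applies name lookup in templates when a base class depends on a template parameter: unqualified names are not searched in such a base, so space has to be reached explicitly, which is exactly what the workarounds listed in the question do. A minimal sketch of those workarounds, reusing the question's names (the body of Space::at is a placeholder just so the example compiles):

      template <int D>
      struct Space {
          struct Plane {
              explicit Plane(Space& b) : space(b) {}
              virtual ~Plane() = default;
              virtual int& at(int y, int z) = 0;
              Space& space;
          };
          struct PlaneX : Plane {
              using Plane::space;                  // workaround 1: using-declaration
              PlaneX(Space& b, int x) : Plane(b), cx(x) {}
              int& at(int y, int z) override {
                  return this->space.at(cx, y, z); // workaround 2: go through this->
                  // workaround 3: qualify explicitly: Plane::space.at(cx, y, z);
              }
              const int cx;
          };
          int& at(int x, int y, int z) { return cell; }  // placeholder storage
          int cell = 0;
      };

      int main() {
          Space<4> sp4;
          Space<4>::PlaneX px(sp4, 1);
          return px.at(2, 3);
      }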

    Read the article
