Search Results

Search found 29193 results on 1168 pages for 'sql merge'.


  • Merge items in nanoc

    - by Gordon Potter
    I have been trying to use nanoc to generate a static website. I need to organize a complex arrangement of pages, and I want to keep my content DRY. How does the concept of includes or merges work within the nanoc system? I have read the docs but I can't seem to find what I want. For example, how can I take two partial content items and merge them together into a new content item? In StaticMatic you can do something like the following inside your page:

        = partial('partials/shared/navigation')

    How would a similar convention work within nanoc?

    Read the article

  • SQL Server "Long running transaction" performance counter: why no workee?

    - by Sleepless
    Please explain to me the following observation: I have the following piece of T-SQL code that I run from SSMS:

        BEGIN TRAN
        SELECT COUNT (*) FROM m WHERE m.[x] = 123456 or m.[y] IN (SELECT f.x FROM f)
        SELECT COUNT (*) FROM m WHERE m.[x] = 123456 or m.[y] IN (SELECT f.x FROM f)
        COMMIT TRAN

    The query takes about twenty seconds to run. I have no other user queries running on the server. Under these circumstances, I would expect the performance counter "MSSQL$SQLInstanceName:Transactions\Longest Transaction Running Time" to rise constantly up to a value of 20 and then drop rapidly. Instead, it rises to around 12 within two seconds and then oscillates between 12 and 14 for the duration of the query, after which it drops again. According to the MS docs, the counter measures "The length of time (in seconds) since the start of the transaction that has been active longer than any other current transaction." But apparently, it doesn't. What gives?
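
    One way to sanity-check the counter while the batch runs (a hedged sketch, not part of the original post) is to watch the active-transaction DMV from a second session and compare its ages with the counter value:

        -- Lists transactions the instance currently considers active, with their age in seconds.
        -- Run from a separate SSMS window while the twenty-second batch is executing.
        SELECT transaction_id,
               name,
               transaction_begin_time,
               DATEDIFF(second, transaction_begin_time, GETDATE()) AS seconds_active
        FROM sys.dm_tran_active_transactions
        ORDER BY transaction_begin_time;

    If the DMV shows the user transaction aging steadily past the counter value, the counter is evidently being refreshed on its own schedule rather than tracking the transaction continuously.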

    Read the article

  • Receiving "MERGE" 200 OK error when committing using trac-post-commit-hook

    - by Lyon Blecher
    When running a commit with the trac-post-commit-hook I receive a MERGE 200 OK error. I understand that this means the commit has succeeded on the server but the file status has not updated on my local machine, but I can't find any way to fix this issue. Would this be a problem with my setup or something in the script? I'm using the stock standard script from the trac site, and I'm committing through TortoiseSVN to VisualSVN Server, which is hosted on a Windows 2008 server. When I run the script through a command line I receive no errors; I only receive this error through TortoiseSVN.

    Read the article

  • How to merge an XML string into a main XML document object

    - by CliffC
    How can I merge the following XML string

        <employee>
          <name>cliff</name>
        </employee>

    into my existing XML document object?

        XmlDocument xmlDoc = new XmlDocument();
        XmlElement xmlCompany = xmlDoc.CreateElement("Company");

    The final output should look like:

        <Company>
          <employee>
            <name>cliff</name>
          </employee>
        </Company>

    Thanks.

    Read the article

  • Subclipse > Accidental Merge Conflict Resolution

    - by DTS
    I'm trying to merge changes from one branch into another using Subclipse. On a particular file in a particular subdirectory, I had a file conflict and edited the conflicts via the context menu option for this. However, when I went to resolve the conflict I apparently chose the wrong option and was left with the original unmerged file in my branch. Since then, I can no longer get this file back into a conflicted state so I can resolve this issue properly. I've tried deleting the file and the directory that contains it, to no avail. Any ideas?

    Read the article

  • Can't find how to import as one object or how to merge

    - by Aaron
    I need to write a script in Blender that creates some birds which fly around some obstacles. The problem is that I need to import a pretty large Collada model (a building) which consists of multiple objects. The import works fine, but the building is not seen as one object. I need to resize and move this building, but I can only get the last object in the building (which is a camera)... Does anyone know how to merge this building into one object, group, variable... so I can resize and move it correctly? Part of the code I used:

        bpy.ops.wm.collada_import(filepath="C:\\Users\\me\\building.dae")
        building = bpy.context.object
        building.scale = (100, 100, 100)
        building.name = "building"

    Read the article

  • Need to merge multiple PDFs into a single PDF with Table of Contents sections

    - by Jason
    We will have 50-100 single PDFs that we'll be generating with a PHP script. The PDFs are generally grouped into groups of 10-20. Each group needs to have its own Table of Contents or Index, and then there also needs to be a master Table of Contents or Index at the beginning. Or, if that is too difficult, we could get away with a single Table of Contents at the beginning. What's the best way to go about this? Will we need to create the Table of Contents, export that to PDF, append it to the beginning and mash the rest of the files after that? Or is there a better solution? And what's the best tool for us to merge the PDFs? This will be running on a Linux server.

    Read the article

  • How do I cast <T> to varbinary and still be able to perform a CONVERT on the SQL side? Implicatio

    - by Biff MaGriff
    Hello, I'm writing this application that will allow a user to define custom quizzes and then allow another user to respond to the questions. Each question of the quiz has a corresponding datatype. All the responses to all the questions are stored vertically in my [Response] table. I currently use two fields to store the response:

        // Response schema
        ResponseID   int
        QuizPersonID int
        QuestionID   int
        ChoiceID     int             // maps to Choice table, used for drop-down lists
        ChoiceValue  varbinary(MAX)  // used to store a user-entered value

    I'm using .NET 3.5, C# and SQL Server 2008. I'm thinking that I would want to store different datatypes in the same field and then, in my SQL report proc, CONVERT to the proper datatype. I'm thinking this is ideal because I only have to check one field. I'm also thinking it might be more trouble than it is worth. I think my other options are to store the data as strings in the db (yuck), or to have a column for each datatype I might use. So what I would like to know is: how would I format my datatypes in C# so that they can be converted properly in SQL? What is the performance hit for converting in SQL? Should I just make a whole whack of columns for each datatype?
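
    For what it's worth, a hedged sketch of the SQL-side CONVERT (not from the original post; the Question table and its DataType column are assumptions, and it presumes each value was written in SQL Server's own binary representation for its type):

        -- Reinterpret each varbinary(MAX) value according to the datatype declared for its question.
        SELECT r.ResponseID,
               r.QuestionID,
               CASE q.DataType
                   WHEN 'int'      THEN CONVERT(varchar(50), CONVERT(int, r.ChoiceValue))
                   WHEN 'datetime' THEN CONVERT(varchar(50), CONVERT(datetime, r.ChoiceValue), 120)
                   WHEN 'nvarchar' THEN CONVERT(nvarchar(4000), r.ChoiceValue)
               END AS TypedValue
        FROM Response r
        JOIN Question q ON q.QuestionID = r.QuestionID;

    The catch is that the binary has to be produced with the matching SQL representation (for example, by converting a typed value to varbinary on the SQL side at insert time); bytes produced by arbitrary .NET serialization will not necessarily CONVERT back cleanly.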

    Read the article

  • Ideas for SVN/SQL/PHP/Linux Dev Environment Supporting Multiple Isolated Environments?

    - by jpganz18
    I am trying to create a "dev" environment for my users. In that environment they would each have access to their own accounts for phpMyAdmin, SQL, Subversion and FTP, which is not a big problem, but I would like to emulate it as if each one were on their own server. I mean that they could change the PHP configuration (for example) and the change would apply only in their own environment. Any idea how to do this? Do I have to do something "special" when installing things on my server, or something like that?

    Read the article

  • How to merge two different Makefiles?

    - by martijnn2008
    I have done some reading on "Merging Makefiles"; one suggestion was that I should leave the two Makefiles separate in different folders [1]. To me this looks counter-intuitive, because I have the following situation: I have 3 source files (main.cpp, flexibility.cpp, constraints.cpp), and one of them (flexibility.cpp) makes use of the COIN-OR Linear Programming library (Clp). When I installed this library on my computer it generated sample Makefiles; I have adjusted one of them and it currently makes a good working binary.

        # Copyright (C) 2006 International Business Machines and others.
        # All Rights Reserved.
        # This file is distributed under the Eclipse Public License.
        # $Id: Makefile.in 726 2006-04-17 04:16:00Z andreasw $

        ##########################################################################
        # You can modify this example makefile to fit for your own program.     #
        # Usually, you only need to change the five CHANGEME entries below.     #
        ##########################################################################

        # To compile other examples, either changed the following line, or
        # add the argument DRIVER=problem_name to make
        DRIVER = main

        # CHANGEME: This should be the name of your executable
        EXE = clp

        # CHANGEME: Here is the name of all object files corresponding to the source
        # code that you wrote in order to define the problem statement
        OBJS = $(DRIVER).o constraints.o flexibility.o

        # CHANGEME: Additional libraries
        ADDLIBS =

        # CHANGEME: Additional flags for compilation (e.g., include flags)
        ADDINCFLAGS =

        # CHANGEME: Directory to the sources for the (example) problem definition
        # files
        SRCDIR = .

        ##########################################################################
        # Usually, you don't have to change anything below. Note that if you    #
        # change certain compiler options, you might have to recompile the      #
        # COIN package.                                                         #
        ##########################################################################

        COIN_HAS_PKGCONFIG = TRUE
        COIN_CXX_IS_CL = #TRUE
        COIN_HAS_SAMPLE = TRUE
        COIN_HAS_NETLIB = #TRUE

        # C++ Compiler command
        CXX = g++

        # C++ Compiler options
        CXXFLAGS = -O3 -pipe -DNDEBUG -pedantic-errors -Wparentheses -Wreturn-type -Wcast-qual -Wall -Wpointer-arith -Wwrite-strings -Wconversion -Wno-unknown-pragmas -Wno-long-long -DCLP_BUILD

        # additional C++ Compiler options for linking
        CXXLINKFLAGS = -Wl,--rpath -Wl,/home/martijn/Downloads/COIN/coin-Clp/lib

        # C Compiler command
        CC = gcc

        # C Compiler options
        CFLAGS = -O3 -pipe -DNDEBUG -pedantic-errors -Wimplicit -Wparentheses -Wsequence-point -Wreturn-type -Wcast-qual -Wall -Wno-unknown-pragmas -Wno-long-long -DCLP_BUILD

        # Sample data directory
        ifeq ($(COIN_HAS_SAMPLE), TRUE)
          ifeq ($(COIN_HAS_PKGCONFIG), TRUE)
            CXXFLAGS += -DSAMPLEDIR=\"`PKG_CONFIG_PATH=/home/martijn/Downloads/COIN/coin-Clp/lib64/pkgconfig:/home/martijn/Downloads/COIN/coin-Clp/lib/pkgconfig:/home/martijn/Downloads/COIN/coin-Clp/share/pkgconfig: pkg-config --variable=datadir coindatasample`\"
            CFLAGS += -DSAMPLEDIR=\"`PKG_CONFIG_PATH=/home/martijn/Downloads/COIN/coin-Clp/lib64/pkgconfig:/home/martijn/Downloads/COIN/coin-Clp/lib/pkgconfig:/home/martijn/Downloads/COIN/coin-Clp/share/pkgconfig: pkg-config --variable=datadir coindatasample`\"
          else
            CXXFLAGS += -DSAMPLEDIR=\"\"
            CFLAGS += -DSAMPLEDIR=\"\"
          endif
        endif

        # Netlib data directory
        ifeq ($(COIN_HAS_NETLIB), TRUE)
          ifeq ($(COIN_HAS_PKGCONFIG), TRUE)
            CXXFLAGS += -DNETLIBDIR=\"`PKG_CONFIG_PATH=/home/martijn/Downloads/COIN/coin-Clp/lib64/pkgconfig:/home/martijn/Downloads/COIN/coin-Clp/lib/pkgconfig:/home/martijn/Downloads/COIN/coin-Clp/share/pkgconfig: pkg-config --variable=datadir coindatanetlib`\"
            CFLAGS += -DNETLIBDIR=\"`PKG_CONFIG_PATH=/home/martijn/Downloads/COIN/coin-Clp/lib64/pkgconfig:/home/martijn/Downloads/COIN/coin-Clp/lib/pkgconfig:/home/martijn/Downloads/COIN/coin-Clp/share/pkgconfig: pkg-config --variable=datadir coindatanetlib`\"
          else
            CXXFLAGS += -DNETLIBDIR=\"\"
            CFLAGS += -DNETLIBDIR=\"\"
          endif
        endif

        # Include directories (we use the CYGPATH_W variables to allow compilation with Windows compilers)
        ifeq ($(COIN_HAS_PKGCONFIG), TRUE)
          INCL = `PKG_CONFIG_PATH=/home/martijn/Downloads/COIN/coin-Clp/lib64/pkgconfig:/home/martijn/Downloads/COIN/coin-Clp/lib/pkgconfig:/home/martijn/Downloads/COIN/coin-Clp/share/pkgconfig: pkg-config --cflags clp`
        else
          INCL =
        endif
        INCL += $(ADDINCFLAGS)

        # Linker flags
        ifeq ($(COIN_HAS_PKGCONFIG), TRUE)
          LIBS = `PKG_CONFIG_PATH=/home/martijn/Downloads/COIN/coin-Clp/lib64/pkgconfig:/home/martijn/Downloads/COIN/coin-Clp/lib/pkgconfig:/home/martijn/Downloads/COIN/coin-Clp/share/pkgconfig: pkg-config --libs clp`
        else
          ifeq ($(COIN_CXX_IS_CL), TRUE)
            LIBS = -link -libpath:`$(CYGPATH_W) /home/martijn/Downloads/COIN/coin-Clp/lib` libClp.lib
          else
            LIBS = -L/home/martijn/Downloads/COIN/coin-Clp/lib -lClp
          endif
        endif

        # The following is necessary under cygwin, if native compilers are used
        CYGPATH_W = echo

        # Here we list all possible generated objects or executables to delete them
        CLEANFILES = clp \
                main.o \
                flexibility.o \
                constraints.o \

        all: $(EXE)

        .SUFFIXES: .cpp .c .o .obj

        $(EXE): $(OBJS)
                bla=;\
                for file in $(OBJS); do bla="$$bla `$(CYGPATH_W) $$file`"; done; \
                $(CXX) $(CXXLINKFLAGS) $(CXXFLAGS) -o $@ $$bla $(LIBS) $(ADDLIBS)

        clean:
                rm -rf $(CLEANFILES)

        .cpp.o:
                $(CXX) $(CXXFLAGS) $(INCL) -c -o $@ `test -f '$<' || echo '$(SRCDIR)/'`$<

        .cpp.obj:
                $(CXX) $(CXXFLAGS) $(INCL) -c -o $@ `if test -f '$<'; then $(CYGPATH_W) '$<'; else $(CYGPATH_W) '$(SRCDIR)/$<'; fi`

        .c.o:
                $(CC) $(CFLAGS) $(INCL) -c -o $@ `test -f '$<' || echo '$(SRCDIR)/'`$<

        .c.obj:
                $(CC) $(CFLAGS) $(INCL) -c -o $@ `if test -f '$<'; then $(CYGPATH_W) '$<'; else $(CYGPATH_W) '$(SRCDIR)/$<'; fi`

    The other Makefile compiles a lot of code and makes use of bison and flex. This one was also made by someone else. I am able to alter this Makefile when I want to add some code, and it also makes a binary.

        CFLAGS=-Wall
        LDLIBS=-LC:/GnuWin32/lib -lfl -lm

        LSOURCES=lex.l
        YSOURCES=grammar.ypp
        CSOURCES=debug.cpp esta_plus.cpp heap.cpp main.cpp stjn.cpp timing.cpp tmsp.cpp token.cpp chaining.cpp flexibility.cpp exceptions.cpp
        HSOURCES=$(CSOURCES:.cpp=.h) includes.h
        OBJECTS=$(LSOURCES:.l=.o) $(YSOURCES:.ypp=.tab.o) $(CSOURCES:.cpp=.o)

        all: solver

        solver: CFLAGS+=-g -O0 -DDEBUG
        solver: $(OBJECTS) main.o debug.o
                g++ $(CFLAGS) -o $@ $^ $(LDLIBS)

        solver.release: CFLAGS+=-O5
        solver.release: $(OBJECTS) main.o
                g++ $(CFLAGS) -o $@ $^ $(LDLIBS)

        %.o: %.cpp
                g++ -c $(CFLAGS) -o $@ $<

        lex.cpp: lex.l grammar.tab.cpp grammar.tab.hpp
                flex -o$@ $<

        %.tab.cpp %.tab.hpp: %.ypp
                bison --verbose -d $<

        ifneq ($(LSOURCES),)
        $(LSOURCES:.l=.cpp): $(YSOURCES:.y=.tab.h)
        endif

        -include $(OBJECTS:.o=.d)

        clean:
                rm -f $(OBJECTS) $(OBJECTS:.o=.d) $(YSOURCES:.ypp=.tab.cpp) $(YSOURCES:.ypp=.tab.hpp) $(YSOURCES:.ypp=.output) $(LSOURCES:.l=.cpp) solver solver.release 2>/dev/null

        .PHONY: all clean debug release

    Both of these Makefiles are, for me, hard to understand; I don't know exactly what they do. What I want is to merge the two of them so I get only one binary; the code compiled by the second Makefile should be the result. I want to add flexibility.cpp and constraints.cpp to the second Makefile, but when I do, I get the following problem:

        flexibility.h:4:26: fatal error: ClpSimplex.hpp: No such file or directory
         #include "ClpSimplex.hpp"

    So the compiler can't find the Clp library. I also tried to copy-paste more code from the first Makefile into the second, but it still gives me that same error. Q: Can you please help me with merging the two Makefiles, or point out a more elegant way? Q: In this case, is it indeed better to merge the two Makefiles? I also tried to use cmake, but I gave up on that one quickly, because I don't know much about flex and bison.

    Read the article

  • SQL 2008 Mirroring, how to failover from the mirror database?

    - by Luis
    I have configured a database mirroring setup in SQL 2008 using the high-safety, synchronous mode, without automatic failover. I don't have a Witness instance. Regarding high availability, I understand Mirroring is a better strategy than Log Shipping (faster and smoother failover) and cheaper than Clustering (because of license and hardware costs). According to the MS docs, to do the failover you need to access the Principal database and, in the "Mirror" options, click the "Failover" button. But I want to do this from the Mirror database; otherwise, what would be the benefit, since all this setup is being done in case the Principal server goes down? Evidently I am missing something. If Mirroring is not a solution for server downtime (as Clustering would be, if I understand correctly), then which practical (i.e. real-world) cases would benefit from Mirroring for high-availability purposes? Thank you very much for your response! I really need some enlightenment.
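
    For reference, a hedged sketch of the T-SQL involved (not from the original post; the database name is hypothetical). A manual failover can only be issued from the principal; once the principal is gone, the mirror can only be brought online with a forced failover, which accepts possible data loss:

        -- On the PRINCIPAL, while both partners are up (manual failover):
        ALTER DATABASE YourDb SET PARTNER FAILOVER;

        -- On the MIRROR, when the principal is unreachable (forced service, possible data loss):
        ALTER DATABASE YourDb SET PARTNER FORCE_SERVICE_ALLOW_DATA_LOSS;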

    Read the article

  • What can cause SQL 2008 Transaction Log Shipping to stop functioning?

    - by Rick
    I read somewhere that taking a backup, or a Maintenance Plan running, can cause Log Shipping to stop functioning. Is this true? What should we watch out for, once our transaction log shipping is in place, that could stop it? A log shipping test we were doing between two databases on the same SQL 2008 server appeared to stop working without any error. When we checked the history of the LSRestore_* job, it was always ignoring the new *.trn files. Any suggestions? Thanks.
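
    As a hedged aside (not from the original post), the log shipping monitor tables in msdb on the secondary can show whether files are being copied but not restored; the column names below are from memory, so treat this as a sketch:

        -- Run on the secondary server: compare the last copied and last restored log files.
        SELECT secondary_database,
               last_copied_file,
               last_copied_date,
               last_restored_file,
               last_restored_date
        FROM msdb.dbo.log_shipping_monitor_secondary;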

    Read the article

  • Advice about performance for local or remote SQL Server?

    - by TruMan1
    I currently have my web server and SQL Express / MySQL server on the same server. It is on a VPS. I have been having problems with my hosting so I am thinking of separating the web and db server into 2 VPS servers. Does anyone recommend this? I am worried that changing my setup from a local DB server to a remote one will degrade performance heavily. They will not be on the same network, but will reference each other via an IP address. Anything I should be aware of?

    Read the article

  • SQL Server: One 12-drive RAID-10 array or 2 arrays of 8-drives and 4-drives

    - by ben
    Setting up a box for SQL Server 2008, which would give the best performance (heavy OLTP)? The more drives in a RAID-10 array, the better the performance, but will losing 4 drives to dedicate them to the transaction logs give us more performance?

        12 drives in RAID-10 plus one hot spare
        OR
        8 drives in RAID-10 for the database and 4 drives in RAID-10 for the transaction logs, plus 2 hot spares (one for each array)

    We have 14 drive slots to work with, and it's an older PowerVault that doesn't support global hot spares.

    Read the article

  • Is it possible to "merge" the values of multiple records into a single field without using a stored procedure?

    - by j0rd4n
    A co-worker posed this question to me, and I told them, "No, you'll need to write a sproc for that". But I thought I'd give them a chance and put this out to the community. Essentially, they have a table with keys mapping to multiple values. For a report, they want to aggregate on the key and "mash" all of the values into a single field. Here's a visual:

        Key  Value
        ---  -----
        1    A
        1    B
        1    C
        2    X
        2    Y

    The result would be as follows:

        Key  Value
        ---  -----
        1    A,B,C
        2    X,Y

    They need this in SQL Server 2005. Again, I think they need to write a stored procedure, but if anyone knows a magic out-of-the-box function that does this, I'd be impressed.
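
    For what it's worth, a minimal sketch of the usual no-sproc approach on SQL Server 2005 (the table name KeyValues is hypothetical):

        -- FOR XML PATH('') concatenates the per-key values; STUFF strips the leading comma.
        SELECT t.[Key],
               STUFF((SELECT ',' + t2.[Value]
                      FROM KeyValues t2
                      WHERE t2.[Key] = t.[Key]
                      FOR XML PATH('')), 1, 1, '') AS [Value]
        FROM KeyValues t
        GROUP BY t.[Key];

    One caveat worth noting: values containing XML-special characters (&, <, >) get entitized by this trick unless extra TYPE/value() handling is added.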

    Read the article

  • How to merge two FBOs?

    - by DevDevDev
    OK so I have 4 buffers, 3 FBOs and a render buffer. Let me explain. I have a view FBO, which will store the scene before I render it to the render buffer. I have a background buffer, which contains the background of the scene. I have a user buffer, which the user manipulates. When the user makes some action I draw to the user buffer, using some blending. Then to redraw the whole scene what I want to do is clear the view buffer, draw the background buffer to the view buffer, change the blending, then draw the user buffer to the view buffer. Finally render the view buffer to the render buffer. However I can't figure out how to draw a FBO to another FBO. What I want to do is essentially merge and blend two FBOs, but I can't figure out how! I'm very new to OpenGL ES, so thanks for all the help.

    Read the article

  • How to merge two tables based on a common column and sort the results by date

    - by techiepark
    Hello friends, I have two MySQL tables and I want to merge the results of these two tables based on the common column rev_id. The merged results should be sorted by the date columns of the two tables. Please help me.

        CREATE TABLE `reply` (
          `id` int(3) NOT NULL auto_increment,
          `name` varchar(25) NOT NULL default '',
          `member_id` varchar(45) NOT NULL,
          `rev_id` int(3) NOT NULL default '0',
          `description` text,
          `post_date` timestamp NOT NULL default CURRENT_TIMESTAMP on update CURRENT_TIMESTAMP,
          `flag` char(2) NOT NULL default 'N',
          PRIMARY KEY (`id`),
          KEY `member_id` (`member_id`)
        ) ENGINE=MyISAM;

        CREATE TABLE `comment` (
          `com_id` int(8) NOT NULL auto_increment,
          `rev_id` int(5) NOT NULL default '0',
          `member_id` varchar(50) NOT NULL,
          `comm_desc` text NOT NULL,
          `date_created` timestamp NOT NULL default CURRENT_TIMESTAMP on update CURRENT_TIMESTAMP,
          PRIMARY KEY (`com_id`),
          KEY `member_id` (`member_id`)
        ) ENGINE=MyISAM;
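
    A hedged sketch of one common way to do this (not from the original post): give both tables a common shape with UNION ALL and sort the combined rows by date. The rev_id value 42 is just an example.

        -- Combine replies and comments for one rev_id and order them chronologically.
        SELECT r.rev_id, r.member_id, r.description AS body, r.post_date AS created_at
        FROM reply r
        WHERE r.rev_id = 42
        UNION ALL
        SELECT c.rev_id, c.member_id, c.comm_desc AS body, c.date_created AS created_at
        FROM comment c
        WHERE c.rev_id = 42
        ORDER BY created_at;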

    Read the article

  • How to configure outgoing connections from an SQL stored procedure?

    - by Peter Vestberg
    I am working on a .NET project which uses Microsoft SQL Server. In this project I need a CLR stored procedure (written in C#) that uses a remote web service. So, when the stored procedure is executed on the SQL Server, it makes web service calls and thus sends packets to a remote location. The problem is that when executing the SP I get: "System.Net.WebException: The request failed with HTTP status 403: Forbidden." The database user has full permission, and the deployed CLR assembly and SP are even marked "unsafe", I tried signing it, etc., so none of that is causing the problem. When I execute the very same C# code, but from a simple console application instead of as a SP, it all works fine. So I started to suspect a network-related problem and had a packet sniffer running when executing both the SP and the console app version. What I realized was that the packets sent out had different destination IP addresses: the console app sent the packets directly to the web service IP, while the SP sent the packets to a proxy server we use in our company. Due to network policies the latter is not allowed, and that explains the "403 Forbidden" exception. So my question boils down to this: how can I configure the SP / MS SQL Server to NOT use that proxy? I want it to send the packets directly to the web service IP, just like the test console app (again, the C# code is the same, so it's not a programming matter). I've disabled all proxy settings in Internet Explorer in case the SQL Server inherits these settings or something. However, no luck. Any help would be greatly appreciated! Best regards, Peter

    Read the article

  • @OneToMany property null in Entity after (second) merge

    - by iNPUTmice
    Hi, I'm using JPA (with Hibernate) and Gilead in a GWT project. On the server side I have this method, and I'm calling it twice with the same "campaign". On the second call it throws a NullPointerException in line 4, at "campaign.getTextAds()":

        public List<WrapperTextAd> getTextAds(WrapperCampaign campaign) {
            campaign = em.merge(campaign);
            System.out.println("getting textads for " + campaign.getName());
            for (WrapperTextAd textad : campaign.getTextAds()) {
                // do nothing
            }
            return new ArrayList<WrapperTextAd>(campaign.getTextAds());
        }

    The code in the WrapperCampaign entity looks like this:

        @OneToMany(mappedBy="campaign")
        public Set<WrapperTextAd> getTextAds() {
            return this.textads;
        }

    Read the article

  • How do I merge a local branch into TFS

    - by Johnny
    hi, I did a stupid thing and branched my project on my local disk instead of doing it on the TFS. So now I have two projects on my disk: the old one which has TFS bindings and the new, which doesn't. I want to merge those changes back into the TFS project. How would I go about doing that? I can't do Compare because my local branch has no TFS bindings. There should be some way to compare the differences between the two projects locally and then meld the differences into the old project and check-in, but I can't find an easy way of doing that. Any other solutions?

    Read the article

  • SQL Server 2008 Web edition in a hosting plan?

    - by Simon
    Do any hosting companies offer SQL Server 2008 Web edition in a hosting plan? GoDaddy, for instance, offers the Standard/Enterprise editions, which raise the price by $200 or so a month. I've tried a few hosting companies and can't find the Web edition available. Why not? The Web edition is supposed to be only $15/month, but I was hoping to be able to get this pricing through a dedicated server and not have to go off and separately get the licensing. I don't even know if it's possible to buy just one copy!?

    Read the article

  • Recover history from foolish git-svn merge

    - by Gregg Lind
    The players:

        master: the svn branch (actual, not local tracking)
        mybranch: a local branch

    My mistake:

        [master] git svn rebase
        [master] git merge mybranch
        [master] git svn dcommit

    I did this twice. Is there a way I can remedy all this? I was thinking something like:

        git checkout --hard [commit before the merging]
        git dcommit   # that to the svn?
        git rebase mybranch
        git dcommit

    But this doesn't seem to work. (I know I should have a. been working from a local tracking branch and b. rebased rather than merged.) I'm in the frantic / willing-to-send-beer-to-respondents stage :)

    Read the article

  • Merge overlapping triangles into a polygon

    - by nornagon
    I've got a bunch of overlapping triangles from a 3D model projected into a 2D plane. I need to merge each island of touching triangles into a closed, non-convex polygon. The resultant polygons shouldn't have any holes in them (since the source data doesn't). Many of the source triangles share (floating point identical) edges with other triangles in the source data. What's the easiest way to do this? Performance isn't particularly important, since this will be done at design time.

    Read the article

  • Javascript array - merge two arrays into one

    - by estrar
    I have two arrays, one with the name of the country and one with the currency type. I would like to merge these together and use the country key instead of the currencies array. What would be the best way to accomplish this? This is what my code looks like now:

        var country = new Array();
        country["SEK"] = 'Sweden';
        country["USD"] = 'United states';

        var currencies = ["SEK", "USD"];

        var options = '';
        for (var i = 0; i < currencies.length; i++) {
            options += '<option value="' + currencies[i] + '" id="' + currencies[i] + '">' + currencies[i] + ' (' + country[currencies[i]] + ')</option>';
        }

    Read the article

  • Back up a database from the default (unnamed) SQL Server instance with PowerShell

    - by sparks
    Trying to connect to an instance of SQL Server 2008 on a server we'll call Sputnik. There are no firewalls in between the two devices. Right now I'm just trying to list databases:

        [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
        [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoExtended") | Out-Null
        [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.ConnectionInfo") | Out-Null
        [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoEnum") | Out-Null

        $servername = "Sputnik"
        $remoteServer = New-Object("Microsoft.SqlServer.Management.Smo.Server") $servername
        $remoteServer.databases

    The following error message occurs:

        The following exception was thrown when trying to enumerate the collection: "Failed to connect to server Sputnik.".
        At line:1 char:15
        + $remoteServer. <<<< databases
            + CategoryInfo          : NotSpecified: (:) [], ExtendedTypeSystemException
            + FullyQualifiedErrorId : ExceptionInGetEnumerator

    Read the article
