Search Results

Search found 418 results on 17 pages for 'uint'.

Page 3/17 | < Previous Page | 1 2 3 4 5 6 7 8 9 10 11 12  | Next Page >

  • Member function overloading/template specialization issue

    - by Ferruccio
    I've been trying to call the overloaded table::scan_index(std::string, ...) member function without success. For the sake of clarity, I have stripped out all non-relevant code. I have a class called table which has an overloaded/templated member function named scan_index() in order to handle strings as a special case. class table : boost::noncopyable { public: template <typename T> void scan_index(T val, std::function<bool (uint recno, T val)> callback) { // code } void scan_index(std::string val, std::function<bool (uint recno, std::string val)> callback) { // code } }; Then there is a hitlist class which has a number of templated member functions which call table::scan_index(T, ...) class hitlist { public: template <typename T> void eq(uint fieldno, T value) { table* index_table = db.get_index_table(fieldno); // code index_table->scan_index<T>(value, [&](uint recno, T n)->bool { // code }); } }; And, finally, the code which kicks it all off: hitlist hl; // code hl.eq<std::string>(*fieldno, p1.to_string()); The problem is that instead of calling table::scan_index(std::string, ...), it calls the templated version. I have tried using both overloading (as shown above) and a specialized function template (below), but nothing seems to work. After staring at this code for a few hours, I feel like I'm missing something obvious. Any ideas? template <> void scan_index<std::string>(std::string val, std::function<bool (uint recno, std::string val)> callback) { // code }

    Read the article

  • WinRT WebView and Cookies

    - by javarg
    Turns out that the WebView control in WinRT is much more limited than its counterpart in WPF/Silverlight. There are some great articles out there on how to extend the control in order for it to support navigation events and some other features. For a personal project I'm working on, I needed to grab the cookies a Web Site generated for the user. Basically, after a user authenticated to a Web Site I needed to get the authentication cookies and generate some extra requests on her behalf. In order to do so, I’ve found this great article about a similar case using SharePoint and Azure ACS. The secret is to use a p/invoke to the native InternetGetCookieEx to get cookies for the current URL displayed in the WebView control.   void WebView_LoadCompleted(object sender, NavigationEventArgs e) { var urlPattern = "http://someserver.com/somefolder"; if (e.Uri.ToString().StartsWith(urlPattern)) { var cookies = InternetGetCookieEx(e.Uri.ToString()); // Do something with the cookies } } static string InternetGetCookieEx(string url) { uint sizeInBytes = 0; // Gets capacity length first InternetGetCookieEx(url, null, null, ref sizeInBytes, INTERNET_COOKIE_HTTPONLY, IntPtr.Zero); uint bufferCapacityInChars = (uint)Encoding.Unicode.GetMaxCharCount((int)sizeInBytes); // Now get cookie data var cookieData = new StringBuilder((int)bufferCapacityInChars); InternetGetCookieEx(url, null, cookieData, ref bufferCapacityInChars, INTERNET_COOKIE_HTTPONLY, IntPtr.Zero); return cookieData.ToString(); }   Function import using p/invoke follows: const int INTERNET_COOKIE_HTTPONLY = 0x00002000; [DllImport("wininet.dll", CharSet = CharSet.Unicode, SetLastError = true)] static extern bool InternetGetCookieEx(string pchURL, string pchCookieName, StringBuilder pchCookieData, ref System.UInt32 pcchCookieData, int dwFlags, IntPtr lpReserved); Enjoy!
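
    A small follow-up sketch (my addition, not from the referenced article): the measuring call legitimately reports a zero size when the URL has no cookies yet, so it helps to surface that case and the Win32 error explicitly. It reuses the InternetGetCookieEx declaration and flag above and additionally needs System.ComponentModel for Win32Exception.

        static string TryGetCookies(string url)
        {
            uint sizeInBytes = 0;
            // First call only measures the buffer; a zero size simply means "no cookies for this URL".
            InternetGetCookieEx(url, null, null, ref sizeInBytes, INTERNET_COOKIE_HTTPONLY, IntPtr.Zero);
            if (sizeInBytes == 0)
                return string.Empty;

            var cookieData = new StringBuilder(Encoding.Unicode.GetMaxCharCount((int)sizeInBytes));
            uint lengthInChars = (uint)cookieData.Capacity;
            if (!InternetGetCookieEx(url, null, cookieData, ref lengthInChars, INTERNET_COOKIE_HTTPONLY, IntPtr.Zero))
                throw new Win32Exception(Marshal.GetLastWin32Error());

            return cookieData.ToString();
        }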

    Read the article

  • GetData() error creating framebuffer

    - by Lelezeus
    I'm currently porting a game written in C# with the XNA library to Android with MonoGame. I have a Texture2D and I'm trying to get an array of uint in this way: Texture2D textureDeform = game.Content.Load<Texture2D>("Texture/terrain"); uint[] pixelDeformData = new uint[textureDeform.Width * textureDeform.Height]; textureDeform.GetData(pixelDeformData, 0, textureDeform.Width * textureDeform.Height); I get the following exception: System.Exception: Error creating framebuffer: Zero at Microsoft.Xna.Framework.Graphics.Texture2D.GetTextureData (Int32 ThreadPriorityLevel) [0x00000] in :0 I found that the problem is in private byte[] GetTextureData(int ThreadPriorityLevel) creating the framebuffer: private byte[] GetTextureData(int ThreadPriorityLevel) { int framebufferId = -1; int renderBufferID = -1; GL.GenFramebuffers(1, ref framebufferId); // framebufferId is still -1 , why can't it be created? GraphicsExtensions.CheckGLError(); GL.BindFramebuffer(All.Framebuffer, framebufferId); GraphicsExtensions.CheckGLError(); //renderBufferIDs = new int[currentRenderTargets]; GL.GenRenderbuffers(1, ref renderBufferID); GraphicsExtensions.CheckGLError(); // attach the texture to FBO color attachment point GL.FramebufferTexture2D(All.Framebuffer, All.ColorAttachment0, All.Texture2D, this.glTexture, 0); GraphicsExtensions.CheckGLError(); // create a renderbuffer object to store depth info GL.BindRenderbuffer(All.Renderbuffer, renderBufferID); GraphicsExtensions.CheckGLError(); GL.RenderbufferStorage(All.Renderbuffer, All.DepthComponent24Oes, Width, Height); GraphicsExtensions.CheckGLError(); // attach the renderbuffer to depth attachment point GL.FramebufferRenderbuffer(All.Framebuffer, All.DepthAttachment, All.Renderbuffer, renderBufferID); GraphicsExtensions.CheckGLError(); All status = GL.CheckFramebufferStatus(All.Framebuffer); if (status != All.FramebufferComplete) throw new Exception("Error creating framebuffer: " + status); ... } The framebufferId is still -1; it seems the framebuffer cannot be created and I don't know why. Any help would be appreciated, thank you in advance.
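
    Two hedged suggestions, since the root cause is not obvious from the snippet: glGenFramebuffers handing back nothing is typical of GL calls made on a thread that does not own the GL context, so it is worth making sure the GetData call runs on the main game thread; and you can avoid Texture2D.GetData's internal framebuffer path altogether by drawing the texture into a RenderTarget2D and reading the pixels back from that. A sketch of the workaround using standard XNA/MonoGame APIs (whether it sidesteps the failure on a given device is an assumption to test):

        // Hypothetical helper: copy the loaded texture into a render target and read back from it.
        uint[] GetPixels(GraphicsDevice device, SpriteBatch spriteBatch, Texture2D texture)
        {
            var target = new RenderTarget2D(device, texture.Width, texture.Height);
            device.SetRenderTarget(target);
            device.Clear(Color.Transparent);

            spriteBatch.Begin();
            spriteBatch.Draw(texture, Vector2.Zero, Color.White);
            spriteBatch.End();

            device.SetRenderTarget(null);    // restore the back buffer

            var pixels = new uint[texture.Width * texture.Height];
            target.GetData(pixels);          // RenderTarget2D is a Texture2D, so GetData works here too
            return pixels;
        }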

    Read the article

  • failbit is being set and I can't figure out why

    - by felipedrl
    I'm writing a MIDI file loader. Everything goes fine until, at some track, the failbit gets set while trying to read from the file. I can't figure out why; I've checked the file size and it's OK too. Checking "errno" returns "0". Any ideas? Thanks. The snippet follows: file.read(reinterpret_cast<char*>(&mHeader.id), sizeof(MidiHeader)); mTracks = new MidiTrack[mHeader.nTracks]; for (uint i = 0; i < mHeader.nTracks; ++i) { // this read fails on 6th i. I've checked hexadecimal file and it's // ok so far. file.read(reinterpret_cast<char*>(&mTracks[i].id), sizeof(uint)); if (file.fail()) { std::cerr << errno << std::endl; massert(false); } massert(mTracks[i].id == 0x6B72544D); file.read(reinterpret_cast<char*>(&mTracks[i].size), sizeof(uint)); mTracks[i].size = swapBytes(mTracks[i].size); mTracks[i].data = new char[mTracks[i].size]; file.read(mTracks[i].data, mTracks[i].size * sizeof(char)); totalBytesRead += 8 + mTracks[i].size; massert(totalBytesRead <= fileSize); }

    Read the article

  • Problem getting correct parameters for C# P/Invoke call to C++ dll

    - by Jim Jones
    Trying to Interop a functionality from the Outside In API from Oracle. Have the following function: SCCERR EXOpenExport {VTHDOC hDoc, VTDWORD dwOutputId, VTDWORD dwSpecType, VTLPVOID pSpec, VTDWORD dwFlags, VTSYSPARAM dwReserved, VTLPVOID pCallbackFunc, VTSYSPARAM dwCallbackData, VTLPHEXPORT phExport); From the header files I reduced the parameters to: typedef VTSYSPARAM VTHDOC, VTLPHDOC * typedef DWORD_PTR VTSYSPARAM typedef unsigned long DWORD_PTR typedef unsigned long VTDWORD typedef VTVOID* VTLPVOID #define VTVOID void typedef VTHDOC VTHEXPORT, *VTLPEXPORT These are for 32 bit windows Going through the header files, the example programs, and the documentation I found: 1. That pSpec could be a pointer to a buffer or NULL, so I set it to a IntPtr.Zero (documentation). 2. That dwFlags and dwReserved according to the documentation "Must be set by the developer to 0". 3. That pCallbackFunc can be set to NULL if I don't want to handle callbacks. 4. That the last two are based on structs that I wrote C# wrappers for using the [StructLayout(LayoutKind.Sequential)]. Then instatiated an instance and generated the parameters by first creating a IntPtr with Marshal.AllocHGlobal(Marshal.SizeOf(instance)), then getting the address value which is passed as a uint for dwCallbackData and a IntPtr for phExport. The final parameter list is as follows: 1. phDoc as a IntPtr which was loaded with an address by the DAOpenDocument function called before. 2. dwOutputId as uint set to 1535 which represents FI_JPEGFIF 3. dwSpecType as int set to 2 which represents IOTYPE_ANSIPATH 4. pSpec as an IntPtr.Zero where the output will be written 5. dwFlags as uint set to 0 as directed 6. dwReserved as uint set to 0 as directed 7. pCallbackFunc as IntPtr set to NULL as I will handle results 8. dwCallBackDate as uint the address of a buffer for a struct 9. phExport as IntPtr to another struct buffer still get an undefined error from the API. Meaning that the call returns a 961 which is not defined in any of the header files. In the past I have gotten this when my choice of parameter types are incorrect. I started out using Interop Assistant which was helpful in learning how many of the parameter types get translated. It is however limited by how well I am able to glean the correct native type from the header files. For example the hDoc parameter used in the preceding function was defined as a non-filesytem handle, so attempted to use Marshal to create a handle, then used an IntPtr, and finally it turned out to be an int (actually it was &phDoc used here). So is there a more scientific way of doing this, other than trial and error? Jim
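
    For comparison, a literal mapping of the prototype quoted above (every pointer-sized typedef such as VTHDOC, VTSYSPARAM and the VTLPVOID/VTLPHEXPORT pointers as IntPtr, every VTDWORD as uint) would look like the sketch below. The module name and calling convention are my assumptions rather than anything from the Outside In documentation, so verify both against the SDK; the main point is that the reserved and callback-data parameters are DWORD_PTR, i.e. pointer-sized, so IntPtr is safer than uint there.

        // Sketch only: "sccex.dll" and the calling convention are assumptions to check against the SDK.
        [DllImport("sccex.dll", CallingConvention = CallingConvention.Cdecl)]
        static extern int EXOpenExport(      // SCCERR returned as a plain integer status
            IntPtr hDoc,                     // VTHDOC     - handle from DAOpenDocument
            uint dwOutputId,                 // VTDWORD    - e.g. FI_JPEGFIF
            uint dwSpecType,                 // VTDWORD    - e.g. IOTYPE_ANSIPATH
            IntPtr pSpec,                    // VTLPVOID   - buffer or IntPtr.Zero
            uint dwFlags,                    // VTDWORD    - must be 0
            IntPtr dwReserved,               // VTSYSPARAM - pointer-sized, so IntPtr rather than uint
            IntPtr pCallbackFunc,            // VTLPVOID   - IntPtr.Zero when callbacks are not handled
            IntPtr dwCallbackData,           // VTSYSPARAM - pointer-sized as well
            out IntPtr phExport);            // VTLPHEXPORT - receives the export handle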

    Read the article

  • Inserting a string array as a row into an Excel document using the Open XML SDK 2.0

    - by Sam
    The code runs, but corrupts my excel document. Any help would be mucho appreciated! I used this as a reference. public void AddRow(string fileName, string[] values) { using (SpreadsheetDocument doc = SpreadsheetDocument.Open(fileName, true)) { SharedStringTablePart sharedStringPart = GetSharedStringPart(doc); WorksheetPart worksheetPart = doc.WorkbookPart.WorksheetParts.First(); uint rowIdx = AppendRow(worksheetPart); for (int i = 0; i < values.Length; ++i) { int stringIdx = InsertSharedString(values[i], sharedStringPart); Cell cell = InsertCell(i, rowIdx, worksheetPart); cell.CellValue = new CellValue(stringIdx.ToString()); cell.DataType = new EnumValue<CellValues>( CellValues.SharedString); worksheetPart.Worksheet.Save(); } } } private SharedStringTablePart GetSharedStringPart( SpreadsheetDocument doc) { if (doc.WorkbookPart. GetPartsCountOfType<SharedStringTablePart>() > 0) return doc.WorkbookPart. GetPartsOfType<SharedStringTablePart>().First(); else return doc.WorkbookPart. AddNewPart<SharedStringTablePart>(); } private uint AppendRow(WorksheetPart worksheetPart) { SheetData sheetData = worksheetPart.Worksheet. GetFirstChild<SheetData>(); uint rowIndex = (uint)sheetData.Elements<Row>().Count(); Row row = new Row() { RowIndex = rowIndex }; sheetData.Append(row); return rowIndex; } private int InsertSharedString(string s, SharedStringTablePart sharedStringPart) { if (sharedStringPart.SharedStringTable == null) sharedStringPart.SharedStringTable = new SharedStringTable(); int i = 0; foreach (SharedStringItem item in sharedStringPart.SharedStringTable. Elements<SharedStringItem>()) { if (item.InnerText == s) return i; ++i; } sharedStringPart.SharedStringTable.AppendChild( new Text(s)); sharedStringPart.SharedStringTable.Save(); return i; } private Cell InsertCell(int i, uint rowIdx, WorksheetPart worksheetPart) { SheetData sheetData = worksheetPart.Worksheet. GetFirstChild<SheetData>(); string cellReference = AlphabetMap.Instance[i] + rowIdx; Cell cell = new Cell() { CellReference = cellReference }; Row row = sheetData.Elements<Row>().ElementAt((int)rowIdx); row.InsertAt(cell, i); worksheetPart.Worksheet.Save(); return cell; }
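
    One concrete thing that stands out (assuming the rest of the referenced sample is unchanged): InsertSharedString appends a bare Text element, but the shared string table expects SharedStringItem children, which by itself is enough for Excel to report the package as corrupt. A minimal sketch of that one change is below; separately, note that Row.RowIndex is 1-based, so RowIndex = count yields 0 for the first appended row, which is worth double-checking as well.

        // Shared strings must be wrapped in SharedStringItem elements, not appended as bare Text.
        sharedStringPart.SharedStringTable.AppendChild(
            new SharedStringItem(new Text(s)));
        sharedStringPart.SharedStringTable.Save();
        return i;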

    Read the article

  • Flex DataGrid not Displaying Data

    - by asawilliams
    I have a custom dataGrid that acts more like a 2D list (if that makes sense at all). I am dynamically creating the columns and recreating the dataProvider to suit my needs. While debugging I can see that I am creating the columns and setting them to the dataGrid and creating and setting the dataProvider, but for some reason I am able to see the dataGrid and the columns but not the data. [Bindable] private var mockData:ArrayCollection = new ArrayCollection([ {value: "425341*"}, {value: "425341*"}, {value: "425341*"}, {value: "425341*"}, {value: "425341*"}, {value: "425341*"}, {value: "425341*"}, {value: "425341*"}, {value: "425341*"}]); ... <TwoDimensionalList width="100%" height="100%" numColumns="7" numRows="7" dataField="value" dataProvider="{mockData}"/> The snippet of code from the object only contains the important functions, everything else is not important (getters, setters, etc ...). While debugging I checked all the variables in the functions to make sure they were assigned and a value that I was expecting. [Snippet] public class TwoDimensionalList extends DataGrid { ... /** * Creates and sets properties of columns */ private function createColumns():void { var column:DataGridColumn; var cols:Array = this.columns; for(var i:uint=0; i < numColumns; i++) { column = new DataGridColumn("column"+i.toString()); column.dataField = "column"+i.toString(); cols[i]=column; } this.columns = cols; // go to the current gotoPage(this._currentPageNum); } /** * Sets the data for the dataprovider based on the * number of columns, rows, and page number. * * @param pageNum the page number of the data to be viewed in the DataGrids. */ private function gotoPage(pageNum:uint):void { _currentPageNum = pageNum; var startIndex:uint = (pageNum-1)*numColumns*numRows; var data:ArrayCollection = new ArrayCollection(); for(var i:uint=0; i < numRows; i++) { var obj:Object = new Object(); for(var j:uint=0; j < numColumns; j++) { if((i+(j*numRows)+startIndex) < _dataProvider.length) obj["column"+j.toString()] = String(_dataProvider.getItemAt(i+(j*numRows)+startIndex)["value"]); } data.addItem(obj); } this.dataProvider = data; } }

    Read the article

  • How can I simulate this application hang scenario?

    - by Pwninstein
    I have a Windows Forms app that itself launches different threads to do different kinds of work. Occasionally, ALL threads (including the UI thread) become frozen, and my app becomes unresponsive. I've decided it may be a Garbage Collector-related issue, as the GC will freeze all managed threads temporarily. To verify that just managed threads are frozen, I spin up an unmanaged one that writes to a "heartbeat" file with a timestamp every second, and it is not affected (i.e. it still runs): public delegate void ThreadProc(); [DllImport("UnmanagedTest.dll", EntryPoint = "MyUnmanagedFunction")] public static extern void MyUnmanagedFunction(); [DllImport("kernel32")] public static extern IntPtr CreateThread( IntPtr lpThreadAttributes, uint dwStackSize, IntPtr lpStartAddress, IntPtr lpParameter, uint dwCreationFlags, out uint dwThreadId); uint threadId; ThreadProc proc = new ThreadProc(MyUnmanagedFunction); IntPtr functionPointer = Marshal.GetFunctionPointerForDelegate(proc); IntPtr threadHandle = CreateThread(IntPtr.Zero, 0, functionPointer, IntPtr.Zero, 0, out threadId); My Question is: how can I simulate this situation, where all managed threads are suspended but unmanaged ones keep on spinning? My first stab: private void button1_Click(object sender, EventArgs e) { Thread t = new Thread(new ThreadStart(delegate { new Hanger(); GC.Collect(2, GCCollectionMode.Forced); })); t.Start(); } class Hanger{ private int[] m_Integers = new int[10000000]; public Hanger() { } ~Hanger() { Console.WriteLine("About to hang..."); //This doesn't reproduce the desired behavior //while (true) ; //Neither does this //Thread.Sleep(System.Threading.Timeout.Infinite); } } Thanks in advance!!

    Read the article

  • Global Hotkey in Mono and Gtk#

    - by Zach Johnson
    I'm trying to get a global hotkey working in Linux using Mono. I found the signatures of XGrabKey and XUngrabKey, but I can't seem to get them working. Whenever I try to invoke XGrabKey, the application crashes with a SIGSEGV. This is what I have so far: using System; using Gtk; using System.Runtime.InteropServices; namespace GTKTest { class MainClass { const int GrabModeAsync = 1; public static void Main(string[] args) { Application.Init(); MainWindow win = new MainWindow(); win.Show(); // Crashes here XGrabKey( win.Display.Handle, (int)Gdk.Key.A, (uint)KeyMasks.ShiftMask, win.Handle, true, GrabModeAsync, GrabModeAsync); Application.Run(); XUngrabKey( win.Display.Handle, (int)Gdk.Key.A, (uint)KeyMasks.ShiftMask, win.Handle); } [DllImport("libX11")] internal static extern int XGrabKey( IntPtr display, int keycode, uint modifiers, IntPtr grab_window, bool owner_events, int pointer_mode, int keyboard_mode); [DllImport("libX11")] internal static extern int XUngrabKey( IntPtr display, int keycode, uint modifiers, IntPtr grab_window); } public enum KeyMasks { ShiftMask = (1 << 0), LockMask = (1 << 1), ControlMask = (1 << 2), Mod1Mask = (1 << 3), Mod2Mask = (1 << 4), Mod3Mask = (1 << 5), Mod4Mask = (1 << 6), Mod5Mask = (1 << 7) } } Does anyone have a working example of XGrabKey? Thanks!
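
    Two hedged things to check before blaming the P/Invoke signatures themselves: XGrabKey expects an X keycode, while Gdk.Key.A is a keysym, and it expects a real X11 Window id, while win.Handle is a GTK widget pointer (how to obtain the X window from Gtk#, e.g. via gdk_x11_drawable_get_xid on the GdkWindow, is an assumption to verify for your GTK version). The keysym-to-keycode half can use the Xlib call XKeysymToKeycode:

        [DllImport("libX11")]
        internal static extern byte XKeysymToKeycode(IntPtr display, IntPtr keysym);

        // Hypothetical helper: 'display' is the X Display*, 'xWindow' is a genuine X11 Window id.
        static void GrabShiftA(IntPtr display, IntPtr xWindow)
        {
            int keycode = XKeysymToKeycode(display, (IntPtr)(int)Gdk.Key.A);
            XGrabKey(display, keycode, (uint)KeyMasks.ShiftMask, xWindow,
                     true, GrabModeAsync, GrabModeAsync);
        }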

    Read the article

  • Flash gets XML but the values are wrong, as3

    - by VideoDnd
    Flash receives the XML, but the values are wrong. How do I fix this? Problem: I can see the XML loaded with no errors, but my output is way off. It's as though it's not receiving any values. Numbers in the output window and animation move rapidly. The Flash file runs as if its variables were set to zero. I changed the order of my code, but that didn't help with this. Please explain how I can correct this. SWF //load xml var myXML:XML; var myLoader:URLLoader = new URLLoader(); myLoader.load(new URLRequest("xml.xml")); myLoader.addEventListener(Event.COMPLETE, processXML); //parse XML function processXML(e:Event):void { myXML = new XML(e.target.data); trace(myXML); //receive values from XML delay = parseInt(myXML.DELAY.text()); trace(delay); repeat = parseInt(myXML.REPEAT.text()); trace(repeat); } //variables var delay:uint = 0; var repeat:uint = 0; //timer and event var timer:Timer = new Timer(uint(delay),uint(repeat)); timer.addEventListener(TimerEvent.TIMER, countdown); //counter function countdown(event:TimerEvent) { myText.text = String(0 + timer.currentCount); trace(0 + timer.currentCount); } timer.start(); XML <?xml version="1.0" encoding="utf-8"?> <SESSION> <DELAY TITLE="starting position">1000</DELAY> <REPEAT TITLE="starting position">60</REPEAT> </SESSION>

    Read the article

  • c++ d3d hooking - COM vtable

    - by Mango
    Trying to make a Fraps type program. See comment for where it fails. #include "precompiled.h" typedef IDirect3D9* (STDMETHODCALLTYPE* Direct3DCreate9_t)(UINT SDKVersion); Direct3DCreate9_t RealDirect3DCreate9 = NULL; typedef HRESULT (STDMETHODCALLTYPE* CreateDevice_t)(UINT Adapter, D3DDEVTYPE DeviceType, HWND hFocusWindow, DWORD BehaviorFlags, D3DPRESENT_PARAMETERS* pPresentationParameters, IDirect3DDevice9** ppReturnedDeviceInterface); CreateDevice_t RealD3D9CreateDevice = NULL; HRESULT STDMETHODCALLTYPE HookedD3D9CreateDevice(UINT Adapter, D3DDEVTYPE DeviceType, HWND hFocusWindow, DWORD BehaviorFlags, D3DPRESENT_PARAMETERS* pPresentationParameters, IDirect3DDevice9** ppReturnedDeviceInterface) { // this call makes it jump to HookedDirect3DCreate9 and crashes. i'm doing something wrong HRESULT ret = RealD3D9CreateDevice(Adapter, DeviceType, hFocusWindow, BehaviorFlags, pPresentationParameters, ppReturnedDeviceInterface); return ret; } IDirect3D9* STDMETHODCALLTYPE HookedDirect3DCreate9(UINT SDKVersion) { MessageBox(0, L"Creating d3d", L"", 0); IDirect3D9* d3d = RealDirect3DCreate9(SDKVersion); UINT_PTR* pVTable = (UINT_PTR*)(*((UINT_PTR*)d3d)); RealD3D9CreateDevice = (CreateDevice_t)pVTable[16]; DetourTransactionBegin(); DetourUpdateThread(GetCurrentThread()); DetourAttach(&(PVOID&)RealD3D9CreateDevice, HookedD3D9CreateDevice); if (DetourTransactionCommit() != ERROR_SUCCESS) { MessageBox(0, L"failed to create createdev hook", L"", 0); } return d3d; } bool APIENTRY DllMain(HINSTANCE hModule, DWORD fdwReason, LPVOID lpReserved) { if (fdwReason == DLL_PROCESS_ATTACH) { MessageBox(0, L"", L"", 0); RealDirect3DCreate9 = (Direct3DCreate9_t)GetProcAddress(GetModuleHandle(L"d3d9.dll"), "Direct3DCreate9"); DetourTransactionBegin(); DetourUpdateThread(GetCurrentThread()); DetourAttach(&(PVOID&)RealDirect3DCreate9, HookedDirect3DCreate9); DetourTransactionCommit(); } // TODO detach hooks return true; }

    Read the article

  • Large flags enumerations in C#

    - by LorenVS
    Hey everyone, got a quick question that I can't seem to find anything about... I'm working on a project that requires flag enumerations with a large number of flags (up to 40-ish), and I don't really feel like typing in the exact mask for each enumeration value: public enum MyEnumeration : ulong { Flag1 = 1, Flag2 = 2, Flag3 = 4, Flag4 = 8, Flag5 = 16, // ... Flag16 = 65536, Flag17 = 65536 * 2, Flag18 = 65536 * 4, Flag19 = 65536 * 8, // ... Flag32 = 65536 * 65536, Flag33 = 65536 * 65536 * 2 // right about here I start to get really pissed off } Moreover, I'm also hoping that there is an easy(ier) way for me to control the actual arrangement of bits on different endian machines, since these values will eventually be serialized over a network: public enum MyEnumeration : uint { Flag1 = 1, // BIG: 0x00000001, LITTLE:0x01000000 Flag2 = 2, // BIG: 0x00000002, LITTLE:0x02000000 Flag3 = 4, // BIG: 0x00000004, LITTLE:0x03000000 // ... Flag9 = 256, // BIG: 0x00000010, LITTLE:0x10000000 Flag10 = 512, // BIG: 0x00000011, LITTLE:0x11000000 Flag11 = 1024 // BIG: 0x00000012, LITTLE:0x12000000 } So, I'm kind of wondering if there is some cool way I can set my enumerations up like: public enum MyEnumeration : uint { Flag1 = flag(1), // BOTH: 0x80000000 Flag2 = flag(2), // BOTH: 0x40000000 Flag3 = flag(3), // BOTH: 0x20000000 // ... Flag9 = flag(9), // BOTH: 0x00800000 } What I've Tried: // this won't work because Math.Pow returns double // and because C# requires constants for enum values public enum MyEnumeration : uint { Flag1 = Math.Pow(2, 0), Flag2 = Math.Pow(2, 1) } // this won't work because C# requires constants for enum values public enum MyEnumeration : uint { Flag1 = Masks.MyCustomerBitmaskGeneratingFunction(0) } // this is my best solution so far, but is definitely // quite clunkie public struct EnumWrapper<TEnum> where TEnum { private BitVector32 vector; public bool this[TEnum index] { // returns whether the index-th bit is set in vector } // all sorts of overriding using TEnum as args } Just wondering if anyone has any cool ideas, thanks!
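
    Since enum members only need to be compile-time constants rather than literals, shift expressions are allowed directly, which gets rid of the multiplication chains entirely (bit positions below picked arbitrarily for illustration). On the endianness point: the enum value is just a number inside the CLR, so byte order only matters at the moment you serialize it, and it is usually cleaner to handle that at the wire boundary (for example IPAddress.HostToNetworkOrder on the underlying integer) than to bake it into the enum itself.

        [Flags]
        public enum MyEnumeration : ulong
        {
            None   = 0,
            Flag1  = 1UL << 0,
            Flag2  = 1UL << 1,
            Flag3  = 1UL << 2,
            // ...
            Flag17 = 1UL << 16,
            Flag33 = 1UL << 32,
            Flag40 = 1UL << 39
        }

        // usage (inside a method):
        // var flags = MyEnumeration.Flag3 | MyEnumeration.Flag33;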

    Read the article

  • IComparer for integers with empty strings at the end

    - by paulio
    Hi, I've written the following IComparer but I need some help. I'm trying to sort a list of numbers but some of the numbers may not have been filled in. I want these numbers to be sent to the end of the list at all times.. for example... [EMPTY], 1, [EMPTY], 3, 2 would become... 1, 2, 3, [EMPTY], [EMPTY] and reversed this would become... 3, 2, 1, [EMPTY], [EMPTY] Any ideas? public int Compare(ListViewItem x, ListViewItem y) { int comparison = int.MinValue; ListViewItem.ListViewSubItem itemOne = x.SubItems[subItemIndex]; ListViewItem.ListViewSubItem itemTwo = y.SubItems[subItemIndex]; if (!string.IsNullOrEmpty(itemOne.Text) && !string.IsNullOrEmpty(itemTwo.Text)) { uint itemOneComparison = uint.Parse(itemOne.Text); uint itemTwoComparison = uint.Parse(itemTwo.Text); comparison = itemOneComparison.CompareTo(itemTwoComparison); } else { // ALWAYS SEND TO BOTTOM/END OF LIST. } // Calculate correct return value based on object comparison. if (OrderOfSort == SortOrder.Descending) { // Descending sort is selected, return negative result of compare operation. comparison = (-comparison); } else if (OrderOfSort == SortOrder.None) { // Return '0' to indicate they are equal. comparison = 0; } return comparison; } Cheers.
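
    A hedged sketch of the missing else branch: deciding the empty-vs-non-empty cases before the Descending negation keeps the blanks at the end for both sort orders, which sounds like the behaviour described above (adjust the SortOrder.None handling if it should still force equality):

        public int Compare(ListViewItem x, ListViewItem y)
        {
            string a = x.SubItems[subItemIndex].Text;
            string b = y.SubItems[subItemIndex].Text;
            bool aEmpty = string.IsNullOrEmpty(a);
            bool bEmpty = string.IsNullOrEmpty(b);

            // Handle empties before the Descending negation so blanks sink to the end either way.
            if (aEmpty && bEmpty) return 0;
            if (aEmpty) return 1;    // x sorts after y
            if (bEmpty) return -1;   // y sorts after x

            int comparison = uint.Parse(a).CompareTo(uint.Parse(b));
            if (OrderOfSort == SortOrder.Descending)
                comparison = -comparison;
            else if (OrderOfSort == SortOrder.None)
                comparison = 0;
            return comparison;
        }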

    Read the article

  • My dialog box does not show up when I compile it using SDK 7.1

    - by zirek
    Hello and welcome everyone. I'd like to build a Win32 application using SDK 7.1. I create the dialog box using the Visual C++ 2012 resource editor, copy resource.rc and resource.h to my folder, and write this simple main.cpp file: #include <windowsx.h> #include <Windows.h> #include <tchar.h> #include "resource.h" #define my_PROCESS_MESSAGE(hWnd, message, fn) \ case(message): \ return( \ SetDlgMsgResult(hWnd, uMsg, \ HANDLE_##message((hWnd), (wParam), (lParam), (fn)) )) \ LRESULT CALLBACK DlgProc(HWND, UINT, WPARAM, LPARAM); BOOL Cls_OnInitDialog(HWND hwnd, HWND hwndFocus, LPARAM lParam); void Cls_OnCommand(HWND hwnd, int id, HWND hwndCtl, UINT codeNotify); int WINAPI _tWinMain( HINSTANCE hInstance, HINSTANCE, LPTSTR, int iCmdLine ) { DialogBoxParam( hInstance, MAKEINTRESOURCE(IDD_INJECTOR), NULL, (DLGPROC) DlgProc, NULL ); return FALSE; } LRESULT CALLBACK DlgProc( HWND hwnd, UINT uMsg, WPARAM wParam, LPARAM lParam ) { switch (uMsg) { my_PROCESS_MESSAGE(hwnd, WM_INITDIALOG, Cls_OnInitDialog); my_PROCESS_MESSAGE(hwnd, WM_COMMAND, Cls_OnCommand); default: break; } return DefWindowProc(hwnd, uMsg, wParam, lParam); } BOOL Cls_OnInitDialog(HWND hwnd, HWND hwndFocus, LPARAM lParam) { return TRUE; } void Cls_OnCommand(HWND hwnd, int id, HWND hwndCtl, UINT codeNotify) { switch(id) { case IDCANCEL: EndDialog(hwnd, id); break; default: break; } } Then I use the following command line to compile my code, which I found on this forum: cl main.cpp /link /SUBSYSTEM:WINDOWS user32.lib My problem is that my dialog box does not show up. When I use procexp to see what happens, I find that my application is created and then closed at the same time, and what makes me wonder is that it works fine in Visual C++ 2012. My SDK 7.1 is installed correctly; I tested it against a basic window without any resource file. Any ideas? I'll be really thankful. Best, Zirek

    Read the article

  • using WrapCompressedRTFStream in C#

    - by Code Smack
    Hello, I am rewording this question (C#, Visual Studio 2008): How do I use WrapCompressedRTFStream when using DllImport with mapi32.dll? Here is a sample of code to import the WrapCompressedRTFStream method (I found this; I did not figure this part out): [DllImport("Mapi32.dll", PreserveSig = true)] private static extern void WrapCompressedRTFStream( [MarshalAs(UnmanagedType.Interface)] UCOMIStream lpCompressedRTFStream, uint ulflags, [MarshalAs(UnmanagedType.Interface)] out UCOMIStream lpUncompressedRTFStream ); public const uint MAPI_MODIFY = 0x00000001; public const uint STORE_UNCOMPRESSED_RTF = 0x00008000; How do I use this in my C# application when my compressedRichText is stored in a string?

    Read the article

  • DirectX 10 Primitive is not displayed

    - by pypmannetjies
    I am trying to write my first DirectX 10 program that displays a triangle. Everything compiles fine, and the render function is called, since the background changes to black. However, the triangle I'm trying to draw with a triangle strip primitive is not displayed at all. The Initialization function: bool InitDirect3D(HWND hWnd, int width, int height) { //****** D3DDevice and SwapChain *****// DXGI_SWAP_CHAIN_DESC swapChainDesc; ZeroMemory(&swapChainDesc, sizeof(swapChainDesc)); swapChainDesc.BufferCount = 1; swapChainDesc.BufferDesc.Width = width; swapChainDesc.BufferDesc.Height = height; swapChainDesc.BufferDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM; swapChainDesc.BufferDesc.RefreshRate.Numerator = 60; swapChainDesc.BufferDesc.RefreshRate.Denominator = 1; swapChainDesc.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT; swapChainDesc.OutputWindow = hWnd; swapChainDesc.SampleDesc.Count = 1; swapChainDesc.SampleDesc.Quality = 0; swapChainDesc.Windowed = TRUE; if (FAILED(D3D10CreateDeviceAndSwapChain( NULL, D3D10_DRIVER_TYPE_HARDWARE, NULL, 0, D3D10_SDK_VERSION, &swapChainDesc, &pSwapChain, &pD3DDevice))) return fatalError(TEXT("Hardware does not support DirectX 10!")); //***** Shader *****// if (FAILED(D3DX10CreateEffectFromFile( TEXT("basicEffect.fx"), NULL, NULL, "fx_4_0", D3D10_SHADER_ENABLE_STRICTNESS, 0, pD3DDevice, NULL, NULL, &pBasicEffect, NULL, NULL))) return fatalError(TEXT("Could not load effect file!")); pBasicTechnique = pBasicEffect->GetTechniqueByName("Render"); pViewMatrixEffectVariable = pBasicEffect->GetVariableByName( "View" )->AsMatrix(); pProjectionMatrixEffectVariable = pBasicEffect->GetVariableByName( "Projection" )->AsMatrix(); pWorldMatrixEffectVariable = pBasicEffect->GetVariableByName( "World" )->AsMatrix(); //***** Input Assembly Stage *****// D3D10_INPUT_ELEMENT_DESC layout[] = { {"POSITION", 0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 0, D3D10_INPUT_PER_VERTEX_DATA, 0}, {"COLOR", 0, DXGI_FORMAT_R32G32B32A32_FLOAT, 0, 12, D3D10_INPUT_PER_VERTEX_DATA, 0} }; UINT numElements = 2; D3D10_PASS_DESC PassDesc; pBasicTechnique->GetPassByIndex(0)->GetDesc(&PassDesc); if (FAILED( pD3DDevice->CreateInputLayout( layout, numElements, PassDesc.pIAInputSignature, PassDesc.IAInputSignatureSize, &pVertexLayout))) return fatalError(TEXT("Could not create Input Layout.")); pD3DDevice->IASetInputLayout( pVertexLayout ); //***** Vertex buffer *****// UINT numVertices = 100; D3D10_BUFFER_DESC bd; bd.Usage = D3D10_USAGE_DYNAMIC; bd.ByteWidth = sizeof(vertex) * numVertices; bd.BindFlags = D3D10_BIND_VERTEX_BUFFER; bd.CPUAccessFlags = D3D10_CPU_ACCESS_WRITE; bd.MiscFlags = 0; if (FAILED(pD3DDevice->CreateBuffer(&bd, NULL, &pVertexBuffer))) return fatalError(TEXT("Could not create vertex buffer!"));; UINT stride = sizeof(vertex); UINT offset = 0; pD3DDevice->IASetVertexBuffers( 0, 1, &pVertexBuffer, &stride, &offset ); //***** Rasterizer *****// // Set the viewport viewPort.Width = width; viewPort.Height = height; viewPort.MinDepth = 0.0f; viewPort.MaxDepth = 1.0f; viewPort.TopLeftX = 0; viewPort.TopLeftY = 0; pD3DDevice->RSSetViewports(1, &viewPort); D3D10_RASTERIZER_DESC rasterizerState; rasterizerState.CullMode = D3D10_CULL_NONE; rasterizerState.FillMode = D3D10_FILL_SOLID; rasterizerState.FrontCounterClockwise = true; rasterizerState.DepthBias = false; rasterizerState.DepthBiasClamp = 0; rasterizerState.SlopeScaledDepthBias = 0; rasterizerState.DepthClipEnable = true; rasterizerState.ScissorEnable = false; rasterizerState.MultisampleEnable = false; rasterizerState.AntialiasedLineEnable = true; 
ID3D10RasterizerState* pRS; pD3DDevice->CreateRasterizerState(&rasterizerState, &pRS); pD3DDevice->RSSetState(pRS); //***** Output Merger *****// // Get the back buffer from the swapchain ID3D10Texture2D *pBackBuffer; if (FAILED(pSwapChain->GetBuffer(0, __uuidof(ID3D10Texture2D), (LPVOID*)&pBackBuffer))) return fatalError(TEXT("Could not get back buffer.")); // create the render target view if (FAILED(pD3DDevice->CreateRenderTargetView(pBackBuffer, NULL, &pRenderTargetView))) return fatalError(TEXT("Could not create the render target view.")); // release the back buffer pBackBuffer->Release(); // set the render target pD3DDevice->OMSetRenderTargets(1, &pRenderTargetView, NULL); return true; } The render function: void Render() { if (pD3DDevice != NULL) { pD3DDevice->ClearRenderTargetView(pRenderTargetView, D3DXCOLOR(0.0f, 0.0f, 0.0f, 0.0f)); //create world matrix static float r; D3DXMATRIX w; D3DXMatrixIdentity(&w); D3DXMatrixRotationY(&w, r); r += 0.001f; //set effect matrices pWorldMatrixEffectVariable->SetMatrix(w); pViewMatrixEffectVariable->SetMatrix(viewMatrix); pProjectionMatrixEffectVariable->SetMatrix(projectionMatrix); //fill vertex buffer with vertices UINT numVertices = 3; vertex* v = NULL; //lock vertex buffer for CPU use pVertexBuffer->Map(D3D10_MAP_WRITE_DISCARD, 0, (void**) &v ); v[0] = vertex( D3DXVECTOR3(-1,-1,0), D3DXVECTOR4(1,0,0,1) ); v[1] = vertex( D3DXVECTOR3(0,1,0), D3DXVECTOR4(0,1,0,1) ); v[2] = vertex( D3DXVECTOR3(1,-1,0), D3DXVECTOR4(0,0,1,1) ); pVertexBuffer->Unmap(); // Set primitive topology pD3DDevice->IASetPrimitiveTopology( D3D10_PRIMITIVE_TOPOLOGY_TRIANGLESTRIP ); //get technique desc D3D10_TECHNIQUE_DESC techDesc; pBasicTechnique->GetDesc(&techDesc); for(UINT p = 0; p < techDesc.Passes; ++p) { //apply technique pBasicTechnique->GetPassByIndex(p)->Apply(0); //draw pD3DDevice->Draw(numVertices, 0); } pSwapChain->Present(0,0); } }

    Read the article

  • GetDC() DllImport for x64 apps

    - by devdept
    If you do a little research on the internet you'll see many DllImport styles for this user32.dll function: HDC GetDC(HWND hWnd); The question is: what type is more appropriate for .NET x64 apps (either compiled with the Platform target as AnyCPU on an x64 machine or specifically as x64)? IntPtr, for example, grows to a size of 8 in an x64 process; can this be a problem? Is uint more appropriate than UInt64? What is the size of the pointers this function uses in an x64 process? The DLL is called user32.dll; does it work as 32-bit or 64-bit on an x64 operating system? [DllImport("user32.dll",EntryPoint="GetDC")] public static extern IntPtr GetDC(IntPtr hWnd); public static extern uint GetDC(uint hWnd); public static extern int GetDC(int hWnd); Thanks!
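
    For what it is worth: window and DC handles are pointer-sized, so IntPtr is the declaration that stays correct for x86, x64 and AnyCPU alike, while uint or int would truncate the value in a 64-bit process. On a 64-bit OS a 64-bit process automatically loads the 64-bit user32.dll (32-bit processes get the WOW64 copy), so the same declaration covers both. A minimal sketch, paired with ReleaseDC so the DC is not leaked:

        [DllImport("user32.dll")]
        public static extern IntPtr GetDC(IntPtr hWnd);

        [DllImport("user32.dll")]
        public static extern int ReleaseDC(IntPtr hWnd, IntPtr hDC);

        // usage
        IntPtr hdc = GetDC(IntPtr.Zero);    // IntPtr.Zero asks for a DC covering the whole screen
        try
        {
            // ... draw or measure with hdc ...
        }
        finally
        {
            ReleaseDC(IntPtr.Zero, hdc);
        }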

    Read the article

  • pointer to member function question

    - by Steve
    Hello, I'm trying to replicate a template I've used before with a member function, and it isn't going very well. The basic form of the function is template<class T> T Convert( HRESULT (*Foo)(T*)) { T temp; Foo(&temp); //Throw if HRESULT is a failure return temp; } HRESULT Converter(UINT* val) { *val = 1; return S_OK; } int _tmain(int argc, _TCHAR* argv[]) { std::cout << Convert<UINT>(Converter) << std::endl; return 0; } For the life of me, I can't get this to work with a member variable. I've read up on their syntax, and I can't seem to figure out how to make it work with templates. The class would be something similar to class TestClass { HRESULT Converter(UINT* val) { *val = 1; return S_OK; } }

    Read the article

  • How can I access and change elements in this private readonly property?

    - by CrimsonX
    I'm trying to figure out how I am able to successfully change a "readonly" array. I have this: namespace ConsoleApplication1 { class Program { static void Main(string[] args) { MyClass myClass = new MyClass(); myClass.Time[5] = 5; // How is this legal? } } public class MyClass { private readonly uint[] time; public IList<uint> Time { get { return time; } } public MyClass() { time = new uint[7]; } } } As I Note above, I would expect that Time[5] would be illegal due to the fact that public IList Time does not have a setter. Additionally, how can I create an array in the constructor which is read-only and unchangeable outside of this class?
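
    readonly only pins the field: the reference cannot be repointed after the constructor runs, but the array it points to stays mutable, and returning it through IList<uint> hands callers that same mutable array (IList<uint> has an indexer setter, so Time[5] = 5 is perfectly legal). To expose a view that rejects writes, wrap the array, for example with Array.AsReadOnly; a sketch:

        public class MyClass
        {
            private readonly uint[] time = new uint[7];

            // Array.AsReadOnly returns a ReadOnlyCollection<uint>, which implements IList<uint>
            // but throws NotSupportedException on any attempt to write through it.
            public IList<uint> Time
            {
                get { return Array.AsReadOnly(time); }
            }
        }

        // myClass.Time[5] = 5;   // now throws instead of silently mutating the private array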

    Read the article

  • Accessing "Mapi32.dll" with C#. [not solved]

    - by Code Smack
    Hello, I am using VS 2008 C# Windows Application. I have this DLL Import I am trying to use. [DllImport("Mapi32.dll", PreserveSig = true)] private static extern void WrapCompressedRTFStream( [MarshalAs(UnmanagedType.Interface)] UCOMIStream lpCompressedRTFStream, uint ulflags, [MarshalAs(UnmanagedType.Interface)] out UCOMIStream lpUncompressedRTFStream ); public const uint MAPI_MODIFY = 0x00000001; public const uint STORE_UNCOMPRESSED_RTF = 0x00008000; I have a compressed string that is in CompressedRFTFormat. How do I pass the string into the WrapCompressedRTFStream? I do not understand what the method is expecting. I am trying to use it on a button. RichText1.text = WrapCompressedRTFStream(_CompressedRichText.ToString(),something,somethingelse); The first error I get is "cannot convert from 'string' to 'System.Runtime.InteropServices.UCOMIStream" I hope someone who understands this posts an answer that helps!
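
    The function does not take a string at all: both parameters are COM streams, so you hand it an IStream positioned over the raw compressed-RTF bytes and read the uncompressed RTF back out of the stream it returns. Below is a hedged sketch of that plumbing using CreateStreamOnHGlobal (a real ole32 export); the chunk size is arbitrary, MAPI may need to be initialised in the process first, and the compressed data has to survive as the original byte[] rather than being round-tripped through ToString(), which will usually mangle it.

        // Needs System.IO, System.Text and System.Runtime.InteropServices.
        [DllImport("ole32.dll")]
        static extern int CreateStreamOnHGlobal(IntPtr hGlobal, bool fDeleteOnRelease,
                                                out UCOMIStream ppstm);

        static string DecompressRtf(byte[] compressedRtf)
        {
            UCOMIStream compressedStream;
            Marshal.ThrowExceptionForHR(
                CreateStreamOnHGlobal(IntPtr.Zero, true, out compressedStream));

            // Load the compressed bytes and rewind to the start (0 = STREAM_SEEK_SET).
            compressedStream.Write(compressedRtf, compressedRtf.Length, IntPtr.Zero);
            compressedStream.Seek(0, 0, IntPtr.Zero);

            UCOMIStream uncompressedStream;
            WrapCompressedRTFStream(compressedStream, 0, out uncompressedStream);

            // Read the plain RTF back out in chunks.
            var output = new MemoryStream();
            var buffer = new byte[4096];
            IntPtr bytesRead = Marshal.AllocHGlobal(sizeof(int));
            try
            {
                int read;
                do
                {
                    uncompressedStream.Read(buffer, buffer.Length, bytesRead);
                    read = Marshal.ReadInt32(bytesRead);
                    output.Write(buffer, 0, read);
                } while (read > 0);
            }
            finally
            {
                Marshal.FreeHGlobal(bytesRead);
            }
            return Encoding.ASCII.GetString(output.ToArray());
        }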

    Read the article

  • Click a button in another application

    - by sam
    I want to use SendMessage or PostMessage to press a button in another app. I have sample code that does this by getting the window handle, but it doesn't work. I also used "WinDowse" to get the required info. Here is the code: private const uint BM_CLICK = 0x00F5; private const uint WM_LBUTTONDOWN = 0x0201; private const uint WM_LBUTTONUP = 0x0202; private void PushOKButton(IntPtr ptrWindow) { WindowHandle = FindWindow(null, "Form1"); if (ptrWindow == IntPtr.Zero) return; IntPtr ptrOKButton = FindWindowEx(ptrWindow, IntPtr.Zero, "Button", "&Yes"); if (ptrOKButton == IntPtr.Zero) return; SendMessage(ptrOKButton, WM_LBUTTONDOWN, 0, 0); SendMessage(ptrOKButton, WM_LBUTTONUP, 0, 0); SendMessage(ptrOKButton, BM_CLICK, 0, 0); } Is there a complete solution in C#?
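
    For completeness, the declarations the snippet is missing look roughly like this (all three are genuine user32 exports). BM_CLICK on its own is normally enough, and note that the caption passed to FindWindowEx has to match the target button exactly, mnemonic ampersand included:

        [DllImport("user32.dll", CharSet = CharSet.Auto)]
        static extern IntPtr FindWindow(string lpClassName, string lpWindowName);

        [DllImport("user32.dll", CharSet = CharSet.Auto)]
        static extern IntPtr FindWindowEx(IntPtr hwndParent, IntPtr hwndChildAfter,
                                          string lpszClass, string lpszWindow);

        [DllImport("user32.dll")]
        static extern IntPtr SendMessage(IntPtr hWnd, uint Msg, IntPtr wParam, IntPtr lParam);

        const uint BM_CLICK = 0x00F5;

        static void ClickYesButton(string windowTitle)
        {
            IntPtr window = FindWindow(null, windowTitle);
            if (window == IntPtr.Zero) return;

            IntPtr button = FindWindowEx(window, IntPtr.Zero, "Button", "&Yes");
            if (button == IntPtr.Zero) return;

            // BM_CLICK makes the target button run its normal click handling in its own process.
            SendMessage(button, BM_CLICK, IntPtr.Zero, IntPtr.Zero);
        }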

    Read the article

  • An issue with tessellating a model with DirectX11

    - by Paul Ske
    I took the hardware tessellation tutorial from Rastertek and implemended texturing instead of color. This is great, so I wanted to implemended the same techique to a model inside my game editor and I noticed it doesn't draw anything. I compared the detailed tessellation from DirectX SDK sample. Inside the shader file - if I replace the HullInputType with PixelInputType it draws. So, I think because when I compiled the shaders inside the program it compiles VertexShader, PixelShader, HullShader then DomainShader. Isn't it suppose to be VertexShader, HullSHader, DomainShader then PixelShader or does it really not matter? I am just curious why wouldn't the model even be drawn when HullInputType but renders fine with PixelInputType. Shader Code: [code] cbuffer ConstantBuffer { float4x4 WVP; float4x4 World; // the rotation matrix float3 lightvec; // the light's vector float4 lightcol; // the light's color float4 ambientcol; // the ambient light's color bool isSelected; } cbuffer cameraBuffer { float3 cameraDirection; float padding; } cbuffer TessellationBuffer { float tessellationAmount; float3 padding2; } struct ConstantOutputType { float edges[3] : SV_TessFactor; float inside : SV_InsideTessFactor; }; Texture2D Texture; Texture2D NormalTexture; SamplerState ss { MinLOD = 5.0f; MipLODBias = 0.0f; }; struct HullOutputType { float3 position : POSITION; float2 texcoord : TEXCOORD0; float3 normal : NORMAL; float3 tangent : TANGENT; }; struct HullInputType { float4 position : POSITION; float2 texcoord : TEXCOORD0; float3 normal : NORMAL; float3 tangent : TANGENT; }; struct VertexInputType { float4 position : POSITION; float2 texcoord : TEXCOORD; float3 normal : NORMAL; float3 tangent : TANGENT; uint uVertexID : SV_VERTEXID; }; struct PixelInputType { float4 position : SV_POSITION; float2 texcoord : TEXCOORD0; // texture coordinates float3 normal : NORMAL; float3 tangent : TANGENT; float4 color : COLOR; float3 viewDirection : TEXCOORD1; float4 depthBuffer : TEXTURE0; }; HullInputType VShader(VertexInputType input) { HullInputType output; output.position.w = 1.0f; output.position = mul(input.position,WVP); output.texcoord = input.texcoord; output.normal = input.normal; output.tangent = input.tangent; //output.normal = mul(normal,World); //output.tangent = mul(tangent,World); //output.color = output.color; //output.texcoord = texcoord; // set the texture coordinates, unmodified return output; } ConstantOutputType TexturePatchConstantFunction(InputPatch inputPatch,uint patchID : SV_PrimitiveID) { ConstantOutputType output; output.edges[0] = tessellationAmount; output.edges[1] = tessellationAmount; output.edges[2] = tessellationAmount; output.inside = tessellationAmount; return output; } [domain("tri")] [partitioning("integer")] [outputtopology("triangle_cw")] [outputcontrolpoints(3)] [patchconstantfunc("TexturePatchConstantFunction")] HullOutputType HShader(InputPatch patch, uint pointId : SV_OutputControlPointID, uint patchId : SV_PrimitiveID) { HullOutputType output; // Set the position for this control point as the output position. output.position = patch[pointId].position; // Set the input color as the output color. 
output.texcoord = patch[pointId].texcoord; output.normal = patch[pointId].normal; output.tangent = patch[pointId].tangent; return output; } [domain("tri")] PixelInputType DShader(ConstantOutputType input, float3 uvwCoord : SV_DomainLocation, const OutputPatch patch) { float3 vertexPosition; float2 uvPosition; float4 worldposition; PixelInputType output; // Interpolate world space position with barycentric coordinates float3 vWorldPos = uvwCoord.x * patch[0].position + uvwCoord.y * patch[1].position + uvwCoord.z * patch[2].position; // Determine the position of the new vertex. vertexPosition = vWorldPos; // Calculate the position of the new vertex against the world, view, and projection matrices. output.position = mul(float4(vertexPosition, 1.0f),WVP); // Send the input color into the pixel shader. output.texcoord = uvwCoord.x * patch[0].position + uvwCoord.y * patch[1].position + uvwCoord.z * patch[2].position; output.normal = uvwCoord.x * patch[0].position + uvwCoord.y * patch[1].position + uvwCoord.z * patch[2].position; output.tangent = uvwCoord.x * patch[0].position + uvwCoord.y * patch[1].position + uvwCoord.z * patch[2].position; //output.depthBuffer = output.position; //output.depthBuffer.w = 1.0f; //worldposition = mul(output.position,WVP); //output.viewDirection = cameraDirection.xyz - worldposition.xyz; //output.viewDirection = normalize(output.viewDirection); return output; } [/code] Somethings are commented out but will be in place when fixed. I'm probably not connecting something correctly.

    Read the article

  • Flixel Game Over Screen

    - by Jamie Read
    I am new to game development but familiar with programming languages. I have started using Flixel and have a working Breakout game with score and lives. I am just stuck on how I can create a new screen/game over screen if a player runs out of lives. I would like the process to be like following: Check IF lives are equal to 0 Pause the game and display a new screen (probably transparent) that says 'Game Over' When a user clicks or hits ENTER restart the level Here is the function I currently have to update the lives: private function loseLive(_ball:FlxObject, _bottomWall:FlxObject):void { // check for game over if (lives_count == 0) { } else { FlxG:lives_count -= 1; lives.text = 'Lives: ' + lives_count.toString() } } Here is my main game.as: package { import org.flixel.*; public class Game extends FlxGame { private const resolution:FlxPoint = new FlxPoint(640, 480); private const zoom:uint = 2; private const fps:uint = 60; public function Game() { super(resolution.x / zoom, resolution.y / zoom, PlayState, zoom); FlxG.flashFramerate = fps; } } }

    Read the article

  • Problem loading .FBX meshes in DirectX 10

    - by N0xus
    I'm trying to load in meshes into DirectX 10. I've created a bunch of classes that handle it and allow me to call in a mesh with only a single line of code in my main game class. How ever, when I run the program this is what renders: In the debug output window the following errors keep appearing: D3D10: ERROR: ID3D10Device::DrawIndexed: Input Assembler - Vertex Shader linkage error: Signatures between stages are incompatible. The reason is that Semantic 'TEXCOORD' is defined for mismatched hardware registers between the output stage and input stage. [ EXECUTION ERROR #343: DEVICE_SHADER_LINKAGE_REGISTERINDEX ] D3D10: ERROR: ID3D10Device::DrawIndexed: Input Assembler - Vertex Shader linkage error: Signatures between stages are incompatible. The reason is that the input stage requires Semantic/Index (POSITION,0) as input, but it is not provided by the output stage. [ EXECUTION ERROR #342: DEVICE_SHADER_LINKAGE_SEMANTICNAME_NOT_FOUND ] The thing is, I've no idea how to fix this. The code I'm using does work and I've simply brought all of that code into a new project of mine. There are no build errors and this only appears when the game is running The .fx file is as follows: float4x4 matWorld; float4x4 matView; float4x4 matProjection; struct VS_INPUT { float4 Pos:POSITION; float2 TexCoord:TEXCOORD; }; struct PS_INPUT { float4 Pos:SV_POSITION; float2 TexCoord:TEXCOORD; }; Texture2D diffuseTexture; SamplerState diffuseSampler { Filter = MIN_MAG_MIP_POINT; AddressU = WRAP; AddressV = WRAP; }; // // Vertex Shader // PS_INPUT VS( VS_INPUT input ) { PS_INPUT output=(PS_INPUT)0; float4x4 viewProjection=mul(matView,matProjection); float4x4 worldViewProjection=mul(matWorld,viewProjection); output.Pos=mul(input.Pos,worldViewProjection); output.TexCoord=input.TexCoord; return output; } // // Pixel Shader // float4 PS(PS_INPUT input ) : SV_Target { return diffuseTexture.Sample(diffuseSampler,input.TexCoord); //return float4(1.0f,1.0f,1.0f,1.0f); } RasterizerState NoCulling { FILLMODE=SOLID; CULLMODE=NONE; }; technique10 Render { pass P0 { SetVertexShader( CompileShader( vs_4_0, VS() ) ); SetGeometryShader( NULL ); SetPixelShader( CompileShader( ps_4_0, PS() ) ); SetRasterizerState(NoCulling); } } In my game, the .fx file and model are called and set as follows: Loading in shader file //Set the shader flags - BMD DWORD dwShaderFlags = D3D10_SHADER_ENABLE_STRICTNESS; #if defined( DEBUG ) || defined( _DEBUG ) dwShaderFlags |= D3D10_SHADER_DEBUG; #endif ID3D10Blob * pErrorBuffer=NULL; if( FAILED( D3DX10CreateEffectFromFile( TEXT("TransformedTexture.fx" ), NULL, NULL, "fx_4_0", dwShaderFlags, 0, md3dDevice, NULL, NULL, &m_pEffect, &pErrorBuffer, NULL ) ) ) { char * pErrorStr = ( char* )pErrorBuffer->GetBufferPointer(); //If the creation of the Effect fails then a message box will be shown MessageBoxA( NULL, pErrorStr, "Error", MB_OK ); return false; } //Get the technique called Render from the effect, we need this for rendering later on m_pTechnique=m_pEffect->GetTechniqueByName("Render"); //Number of elements in the layout UINT numElements = TexturedLitVertex::layoutSize; //Get the Pass description, we need this to bind the vertex to the pipeline D3D10_PASS_DESC PassDesc; m_pTechnique->GetPassByIndex( 0 )->GetDesc( &PassDesc ); //Create Input layout to describe the incoming buffer to the input assembler if (FAILED(md3dDevice->CreateInputLayout( TexturedLitVertex::layout, numElements,PassDesc.pIAInputSignature, PassDesc.IAInputSignatureSize, &m_pVertexLayout ) ) ) { return false; } model loading: 
m_pTestRenderable=new CRenderable(); //m_pTestRenderable->create<TexturedVertex>(md3dDevice,8,6,vertices,indices); m_pModelLoader = new CModelLoader(); m_pTestRenderable = m_pModelLoader->loadModelFromFile( md3dDevice,"armoredrecon.fbx" ); m_pGameObjectTest = new CGameObject(); m_pGameObjectTest->setRenderable( m_pTestRenderable ); // Set primitive topology, how are we going to interpet the vertices in the vertex buffer md3dDevice->IASetPrimitiveTopology( D3D10_PRIMITIVE_TOPOLOGY_TRIANGLELIST ); if ( FAILED( D3DX10CreateShaderResourceViewFromFile( md3dDevice, TEXT( "armoredrecon_diff.png" ), NULL, NULL, &m_pTextureShaderResource, NULL ) ) ) { MessageBox( NULL, TEXT( "Can't load Texture" ), TEXT( "Error" ), MB_OK ); return false; } m_pDiffuseTextureVariable = m_pEffect->GetVariableByName( "diffuseTexture" )->AsShaderResource(); m_pDiffuseTextureVariable->SetResource( m_pTextureShaderResource ); Finally, the draw function code: //All drawing will occur between the clear and present m_pViewMatrixVariable->SetMatrix( ( float* )m_matView ); m_pWorldMatrixVariable->SetMatrix( ( float* )m_pGameObjectTest->getWorld() ); //Get the stride(size) of the a vertex, we need this to tell the pipeline the size of one vertex UINT stride = m_pTestRenderable->getStride(); //The offset from start of the buffer to where our vertices are located UINT offset = m_pTestRenderable->getOffset(); ID3D10Buffer * pVB=m_pTestRenderable->getVB(); //Bind the vertex buffer to input assembler stage - md3dDevice->IASetVertexBuffers( 0, 1, &pVB, &stride, &offset ); md3dDevice->IASetIndexBuffer( m_pTestRenderable->getIB(), DXGI_FORMAT_R32_UINT, 0 ); //Get the Description of the technique, we need this in order to loop through each pass in the technique D3D10_TECHNIQUE_DESC techDesc; m_pTechnique->GetDesc( &techDesc ); //Loop through the passes in the technique for( UINT p = 0; p < techDesc.Passes; ++p ) { //Get a pass at current index and apply it m_pTechnique->GetPassByIndex( p )->Apply( 0 ); //Draw call md3dDevice->DrawIndexed(m_pTestRenderable->getNumOfIndices(),0,0); //m_pD3D10Device->Draw(m_pTestRenderable->getNumOfVerts(),0); } Is there anything I've clearly done wrong or are missing? Spent 2 weeks trying to workout what on earth I've done wrong to no avail. Any insight a fresh pair eyes could give on this would be great.

    Read the article
