[math-fun] And you think You have problems ...
a few tidbits -- rich

-----

From: Balaji Gopalakrishnan <bgopalak@ima.umn.edu>
To: dmanet@zpr.uni-koeln.de
Subject: [DMANET] Question on Large-Scale LP's
Date: Mon, 1 Mar 2004 21:20:18 -0600 (CST)

Hi,

I have an LP with 300 million columns and 200 million rows. The basic structure is like the multicommodity flow problem: about 20 linking constraints (completely dense), and the rest block angular. The blocks are all of the same size (4 rows and 7 columns in each block, with about 50 million blocks). The variables are all between 0 and 1.

Can someone suggest an approach to such ultra-large-scale LP problems? Any reference to papers/articles that discuss solving such large-scale LPs would be helpful. I am currently exploring some ideas using potential function methods to obtain approximate solutions, but the problems solved in the literature are relatively small compared to this one, and I am not sure how such methods will perform at this scale.

Thanks in advance.

-Balaji

-----

I recently puzzled over the following problem: I have a 1 Mbit file; I'm willing to add 100K additional bits for error correction. What should I do?

My best guess is to use random parity bits, with each EC bit based on maybe 100 randomly selected bits from the file. There's some hope that an iterative-improvement decoder would converge if the bit error rate were < 1%. Computing the parity bits is slow.

-----

Hilarie has suggested that Feb 28 be called Look Day.
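[A note on Balaji's question above: a block-angular LP with only ~20 linking rows is the textbook setting for decomposition, e.g. Dantzig-Wolfe or Lagrangian relaxation of the linking constraints. Below is a toy sketch of the Lagrangian route in Python/SciPy. All dimensions, data, and the step rule are invented for illustration; a real 50-million-block instance would solve the tiny block LPs in parallel and use a more careful multiplier update. This yields a dual lower bound and an approximate primal point, not an exact optimum.]

    import numpy as np
    from scipy.optimize import linprog

    rng = np.random.default_rng(0)

    # Toy stand-in for the structure described above: NB independent blocks
    # of 4 rows x 7 columns, a few dense linking rows, 0/1 variable bounds.
    # (The real instance has ~50 million blocks and 20 linking rows.)
    NB, BR, BC, LINK = 8, 4, 7, 3
    n = NB * BC

    c = rng.uniform(-1, 1, n)                      # objective
    A = rng.uniform(0, 1, (LINK, n))               # dense linking rows: A x <= b
    b = np.full(LINK, 0.3 * n)
    Bs = [rng.uniform(0, 1, (BR, BC)) for _ in range(NB)]  # block matrices
    ds = [np.full(BR, 2.0) for _ in range(NB)]             # B_k x_k <= d_k

    # Lagrangian relaxation: dualize A x <= b with multipliers lam >= 0.
    # The relaxed problem separates into NB tiny LPs,
    #     min (c_k + A_k^T lam) . x_k   s.t.  B_k x_k <= d_k,  0 <= x_k <= 1,
    # solvable independently (and in parallel); a subgradient step then
    # updates lam.
    lam = np.zeros(LINK)
    for it in range(60):
        x = np.empty(n)
        for k in range(NB):
            sl = slice(k * BC, (k + 1) * BC)
            res = linprog(c[sl] + A[:, sl].T @ lam,
                          A_ub=Bs[k], b_ub=ds[k],
                          bounds=(0, 1), method="highs")
            x[sl] = res.x
        g = A @ x - b                              # subgradient of the dual
        bound = c @ x + lam @ g                    # Lagrangian lower bound
        lam = np.maximum(0.0, lam + g / (it + 1))  # diminishing step size

    print("Lagrangian lower bound:", bound)
    print("worst linking violation:", max(0.0, g.max()))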
> I have a 1 Mbit file; I'm willing to add 100K additional bits for error correction. What should I do?

Perhaps you could add six bits of error-correcting code for each block of 64 bits. That provides room for a code of Hamming distance three, so you'd get single-error correction in each block. (But double errors would be miscorrected.)

--
Don Reble  djr@nk.ca
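[A quick check of the arithmetic in Don's suggestion: a Hamming code with r check bits gives single-error correction for a block of total length at most 2^r - 1, i.e. at most 2^r - 1 - r data bits. Six check bits therefore cover at most 57 data bits (2^6 - 1 - 6 = 57), and a 64-bit data block needs r = 7, since 2^7 = 128 >= 64 + 7 + 1 = 72. So 6-per-64 fits Rich's bit budget (9.4% overhead) but is one check bit short of distance three, while 7-per-64 (about 10.9%) slightly exceeds the 100K allowance.]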
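[Rich's random-parity scheme is essentially a high-rate low-density parity-check (LDPC) code, and the iterative-improvement decoder he hopes for is Gallager-style bit flipping. Here is a scaled-down sketch; every parameter is invented (1000 data bits, 100 checks, 30 bits per check instead of ~100, each data bit in about 3 checks), and for simplicity the toy assumes the 100 parity bits themselves survive intact. As Rich says, convergence is hoped for rather than guaranteed; bit flipping can stall or miscorrect when errors share checks.]

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy scale-down of "1 Mbit file + 100K random parity bits":
    # 1000 data bits, 100 checks of ~30 bits each (so each data bit
    # lands in about 3 checks, which is friendlier to bit flipping).
    K, M, CHECK_DEG = 1000, 100, 30

    # Random sparse parity structure: deal 3 "stubs" per data bit into
    # checks of CHECK_DEG stubs each (duplicate stubs simply collapse).
    stubs = np.repeat(np.arange(K), 3)
    rng.shuffle(stubs)
    H = np.zeros((M, K), dtype=np.uint8)
    for j in range(M):
        H[j, stubs[j * CHECK_DEG:(j + 1) * CHECK_DEG]] = 1

    data = rng.integers(0, 2, K, dtype=np.uint8)
    parity = H @ data % 2                        # the stored EC bits

    # Corrupt a few data bits (< 1% error rate).
    received = data.copy()
    received[rng.choice(K, 3, replace=False)] ^= 1

    # Gallager bit flipping: while some checks fail, flip the data bit
    # that participates in the most failed checks.
    for _ in range(100):
        syndrome = (H @ received + parity) % 2   # 1 marks a failed check
        if not syndrome.any():
            break
        votes = syndrome.astype(int) @ H         # failed-check count per bit
        received[np.argmax(votes)] ^= 1

    print("recovered file intact:", np.array_equal(received, data))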