Security Forums


Public Key Encryption

JustinT
Trusted SF Member


Joined: 17 Apr 2003
Posts: 16777215
Location: Asheville, NC, US / Uberlândia, MG, Brazil


Posted: Thu Apr 14, 2005 12:42 pm    Post subject: Thoughts.

Bungle wrote:

As far as I am aware I think SHA512 is the most secure but you better ask someone else.


Although there isn't a present, imminent reason to disagree with this to any great extent, the media has largely failed to address the core of the issue. Contrary to what one might presume, given the media's portrayal of the recent months' worth of hash function cryptanalysis, it's not really a matter of using hash functions with larger output lengths; it is, rather, a matter of designing new hash functions that incorporate different strategies. I've written more thoughts in a short memo, here.

Why is this important? The conventional hash functions we've witnessed cryptanalysis of, recently, are all composed of similar strategies; they are based on the design principles of the Unbalanced Feistel Network structure of MD4. Quite a bit of interesting analysis surfaced, for several of these functions, within a short span of only months. Output length may buy us time, but perhaps we should have a variety of strategic routes to take, if science proves the current route incapable of further providing us with the conservative, yet efficient, cryptographic security we need.
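To put the "output length may buy us time" remark in concrete terms, here is a minimal sketch using Python's standard hashlib. For an n-bit digest, the generic birthday bound puts collisions at roughly 2^(n/2) work; that bound is all a longer output buys you, structural weaknesses aside:

Code:

import hashlib

# Generic birthday bound: collisions in an n-bit hash cost ~2^(n/2) work.
# A longer output raises that bound, but does nothing about structural
# attacks on the MD4 lineage itself.
for name in ("md5", "sha1", "sha256", "sha512"):
    n = hashlib.new(name).digest_size * 8
    print(f"{name:7s} {n:4d}-bit digest -> ~2^{n // 2} generic collision work")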

huh wrote:

I mean it should be just like wearing layers of clothing to protect yourself from the cold, but it's not as simple as that.


It does seem as if it should be that simple, indeed! However, mathematics can be rather tricky, at times, in extremely subtle ways. This is why layering requires a meticulous effort to make sure these subtleties are avoided. The subtleties may include keys that are related in some way, or perhaps even relationships between the primitives themselves.

Realistically, when done correctly, most of the issues surrounding both multiple and cascaded constructions can likely be addressed and avoided successfully. On the other hand, layering introduces much more complexity than is practically necessary, and if the architect behind the system isn't cryptographically competent enough, he or she may end up making things insecure.

I suppose, to simplify things a bit, the reason it's "not that simple" is that the mathematics of cryptographic primitives can, as aforementioned, behave in rather subtle ways. When we try to do unusual things with primitives, such as using them in constructions they weren't originally designed for, or in ways their security models weren't meant to satisfy, then unusual things can happen.

At that point, we're left with the complexity of dealing with assumptions that essentially require cryptanalysis of their own. In practical cryptography, we need things to be simple and efficient; it's absolutely vital that we rely on as few assumptions as possible. While layering, whether in multiple or cascaded schemes, has potential benefit, we can do without it, in practice, given the current state of what we know about the security of conventional cryptography.
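For what it's worth, if one does layer, the careful way is two different primitives with independently generated keys, which sidesteps the related-key subtlety mentioned above. Below is a minimal sketch of such a cascade in Python, using the cryptography package's AEAD constructions (an assumption for illustration; this is not how PGP or any particular product does it):

Code:

import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM, ChaCha20Poly1305

def cascade_encrypt(plaintext: bytes):
    # Independent keys: never derive one layer's key from the other's.
    k1 = AESGCM.generate_key(bit_length=256)
    k2 = ChaCha20Poly1305.generate_key()
    n1, n2 = os.urandom(12), os.urandom(12)
    inner = AESGCM(k1).encrypt(n1, plaintext, None)        # layer 1: AES-GCM
    outer = ChaCha20Poly1305(k2).encrypt(n2, inner, None)  # layer 2: ChaCha20-Poly1305
    return outer, (k1, n1), (k2, n2)

Decryption simply peels the layers in reverse order; the point is only that the two keys are unrelated, so a break of one primitive or key tells the adversary nothing about the other.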

There's a difference between being conservative and introducing unnecessary complexity out of sheer paranoia. Allow me to distinguish between good paranoia and bad paranoia. First, bad paranoia, which usually invites the mentality of throwing every inch of cryptographic research into one layered juggernaut of mayhem, is the type that the uninformed summon; they create complexity that their sparse knowledge of cryptography cannot address securely, thus rendering horribly insecure systems.

Good paranoia is a natural part of cryptography. Academic cryptographers are, essentially, mathematical paranoids. After all, we're assuming information is at risk. The usefulness of this paranoia is that it's applicable to cryptographic design. We assume the worst-case scenario is realistic; we assume it occurs frequently. This makes establishing a threat model much simpler than if we attempt to specify one threat at a time, making assumptions as to which threats are significant and which are insignificant. Using paranoia as a basis for this model is the type of rigorous tactic that a good primitive should remain secure through.

Note, however, that for real-world applications of good cryptography, this doesn't mean cobbling together every last measure known to man. Consider a scenario where one does this, though. The more components, the more complexity. The more complex the system, the more complex the implementation faults have the potential to be. In turn, this implies a system where analysis is also complex. In a model based on paranoia, this contradicts what is desired, as complexity gives the adversary more potential for exploitation that the victim may fail to notice.

So, you see, being paranoid, in the cryptographic sense, isn't about piling on as much cryptography as possible, but about making sure that your cryptography is simple enough to analyze, such that it withstands your threat model in practice! We can do this conservatively, and rather efficiently, which is why I advocate it. Be sure to exercise the right kind of paranoia, as well! Remember, good cryptography is the result of one approach - realizing security by first recognizing insecurity. In other words, understanding how to attack bad cryptography is vital in understanding how to define good cryptography to defend against such attacks.

Simplicity is the pillar of the thorough analysis that is required to make this possible. So, be conservative, but not complex, if it can be helped! After all, if you're paranoid, you want as much assurance as possible, right? Keep that in mind.
Bungle
Most Paranoid Member!


Joined: 03 Feb 2005
Posts: 2



Posted: Thu Apr 14, 2005 2:01 pm

Hi JT

Thank you for the link to the PDF. It was very interesting.

Now I am really going to push my luck here and dare I say it ….. I have a small problem with one part of it!!


Quote:

Perhaps this need for strategic variety requires an initiative. Maybe a design-by-contest; it worked exceptionally well for the AES selection process.


I personally believe they came to the wrong conclusion during this selection process, certainly as far as security goes, at least. I believe most people, perhaps even yourself, would have chosen “Serpent” over Rijndael as the AES.

Right, that’s it, I’ve done it. I’ve stuck my neck out and criticised JT!! Oh my God, I daren’t log back in!!

Please be gentle with me when you prove me wrong!

Bungle.
data
Forum Fanatic


Joined: 08 May 2004
Posts: 16777211
Location: India


Posted: Thu Apr 14, 2005 2:50 pm

hi Bungle,

Serpent is way slower than Rijndael. This gives a good comparison.

Moreover, this encryption is meant to be used in U.S. Federal and government offices, perhaps including the databases of the IRS, or the census database, or something very large of that kind. It would be way too slow if they used Serpent. There had to be some tradeoff between security and efficiency.

I would also like to quote from Justin's earlier post.

JustinT wrote:
Sure thing. Serpent is quite a conservatively robust choice; much tighter than the other finalists. One cool trait is that it, along with Rijndael, is rather effective when defending against timing attacks, while RC6 is at the low end of that category. With its juggernaut appeal does come potential software penalties, however, which place it [Serpent] at the bottom of the rung in several software categories. It's much like DES was - nearly as fast, and better in hardware. But, for me, the decreased performance is well worth the wider margin of security.


Data.
Bungle
Most Paranoid Member!


Joined: 03 Feb 2005
Posts: 2



Posted: Thu Apr 14, 2005 4:57 pm

Hi Datah

Quote:

There had to be some tradeoff between security and efficiency.


Yes, I understand this, thanks. But my point was that it depends on who is testing, and for what purpose. I believe members here on SFDC value security over performance. When JT said that “it worked exceptionally well for the AES selection process,” I was just pointing out that it didn’t, as far as security was concerned.

Also, I was just trying to get JT worked up a bit!! I knew he preferred Serpent for security; that’s why I said the following.

Quote:

certainly as far as security goes, at least. I believe most people, perhaps even yourself, would have chosen “Serpent” over Rijndael as the AES.


Thanks, Datah.

Bungle.
JustinT
Trusted SF Member


Joined: 17 Apr 2003
Posts: 16777215
Location: Asheville, NC, US / Uberlândia, MG, Brazil


Posted: Thu Apr 14, 2005 11:40 pm    Post subject: More thoughts.

Hehe. No worries. I welcome criticism, as it provides new perspective on ideas. New perspective leads to the opening of new avenues of thought, and is beneficial to the learning process of which we will forever be a part. So, thanks for perusing the paper and sharing your opinion! It is appreciated.

I suppose we can break this down into two groups: first, you have the standardizing bodies, such as NIST; second, you have the casual users, such as you and me. When standardizing a primitive, the constraints are much more vital during the process than they are noticeable to those of us who do not actively use specialized hardware, or platforms where time, space, memory, and things of that sort are of the essence.

Between Rijndael, Twofish, and Serpent, we will likely not find ourselves in a position where performance really determines which of these we choose; general computer use gives us that luxury. This is why we can place sole precedence on aspects such as conservative security, and why Twofish and Serpent are the more "conservative-conscious" choices. In fact, it is obvious that Rijndael could have been a bit more conservative than it is, in regards to the number of rounds used for a given key length.

However, there are a few things to consider that you won't find in the media's portrayal of cryptography. This is rather advanced, in terms of knowing the structure of Rijndael, but I'll do my best to make the point as obvious as I can. Quoting myself:

JustinT wrote:

You must be careful when using the notion of "security margin." It's important that you understand how to view it. Obviously, given the ratio of rounds covered by attacks as opposed to the total number of rounds, Serpent has a larger margin of security than Twofish, which has a larger margin of security than Rijndael. However, each of these block ciphers is semantically unique; what takes place during a round transformation is based on unique design strategies which meet particular security goals.

Don't make the mistake of taking things too literally, by comparing apples with oranges. Consider, for example, that two rounds of Rijndael provide a "complete" diffusion effect, in a sense; some block cipher structures only achieve this after three or so rounds. Also, the number of rounds specified may affect the number of rounds necessary through which a propagation trail is to be located, which is a point of exploit for differential and truncated differential cryptanalysis, as well as linear cryptanalysis and even a saturation attack (i.e., structural, Square, et cetera). This is just to hint that a "margin of security" can be specific to the semantics of an individual algorithm, just as it can be the generic notion it is usually taken for.

The generic notion only echoes the obvious, which is the current resilience of the block cipher against known attacks; there is no assurance of resilience against cryptanalytical advances or unknown attacks. Rijndael's security, as it is specified in the AES, does rely on the hope that no further improvements in known attacks will occur, as well as unknown attacks; this applies to any block cipher for which attacks exist, essentially. Based on this generic notion, it is, however, particularly worrisome that the standard carries the smallest margin, given its subjection to perhaps the most rigorous cryptanalysis.

Either way, while it may be reasonable to say that Rijndael, as specified in the AES, has a "less conservative" generic margin of security, or is even more fragile, I don't believe it's justifiable to mark it off as weak, in the practical sense, such that it isn't suitable for the security of most conventional threat models. Its modularity, as with Square, would make it quite trivial to increase the number of rounds, if need be, in the event that the standard is, or needs to be, altered. If you use the AES, use it because it is the standard; consider the effect on cryptanalysis and implementation efficiency that being a standard invokes.


While Rijndael, as specified in the AES, has a smaller, less conservative, generic margin of security, conservatism was exercised in areas such as establishing more rounds than what is necessary to achieve the full diffusion effect. The point is - the generic model for a margin of security is very loose. Twofish, Serpent, and Rijndael are quite different from one another; it would be inaccurate to assume that they are comparable, round for round, in regards to how many rounds they have versus how many rounds have been successfully attacked for reduced-round variants.

This is simply because different things happen during the round transformations of these primitives. So, in short, while the generic model for determining a margin of security is helpful, it's best to take it for what it's worth - a means of suggesting a configuration that makes a given primitive more conservative, as opposed to a way to compare primitives and decide which is better. There's much more to it than that.
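To see why the generic notion is such a loose yardstick, you can reduce it to a single ratio of attacked rounds to total rounds. In the sketch below, the round totals come from the respective specifications; the "attacked" counts are illustrative figures roughly in line with the reduced-round attacks published around this time, not authoritative cryptanalytic results, so treat them as placeholders:

Code:

# Generic "margin of security": attacked rounds versus total rounds.
# Totals are per the specs; attacked-round counts are illustrative only.
ciphers = {
    "Rijndael (AES-128)": (7, 10),
    "Twofish": (6, 16),
    "Serpent": (11, 32),
}
for name, (attacked, total) in ciphers.items():
    print(f"{name:20s} {attacked}/{total} rounds ({attacked / total:.0%})")

The percentages suggest Serpent is the most conservative and Rijndael the least, but, as noted above, a round of one cipher is not a round of another, so the ratio compares apples with oranges.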

I quite like the design strategies of each of the three block ciphers. Serpent is a product of the type of conservatism I enjoy seeing. Rijndael is an exceptionally simple primitive, with well-organized and segregated components, involved in a strategy that I consider one of the better candidates for trends we should follow in block cipher design, as well as hash function design. Twofish, albeit different from the two, is a nice balance between robustness and efficiency, which makes it a decent choice to advocate. From a basic analysis of their structure, they arrive at decent security, in their respective ways.

Going back to the two groups - standardizing bodies and us - it's easy to see how I can be much more flexible in my choice. I haven't the constraints to meet; a standard must be flexible enough to meet the constraints of many. The AES selection, "design-by-contest", did work incredibly well, beyond our expectations. While Rijndael may not be as generically robust as we'd have hoped for, as it is specified in the AES, the majority of block cipher cryptanalysis, now, and in the near future, will be applied to its structure.

This is beneficial to the wide-trail strategy (its underlying structure); it's beneficial to us. This increase in cryptanalytical interest is one reason it seems logical to apply this design strategy to hash functions. Overall, it's a good block cipher. Most importantly, along with the increase in analysis that it has seen and will continue to see, its structure is one of inviting simplicity, which makes its security much easier to justify through cryptanalysis. So, in all actuality, if you consider these things, the AES selection process did decently, in terms of security. Fortunately, Rijndael is modular enough to allow round extensions, if it's ever decided that the AES specification should be revised in a more conservative way.

So, to be more precise, I do prefer the conservatism of Serpent, and the nice trade-off of robustness and efficiency that Twofish provides, but I quite like the design strategy behind Rijndael, and would like to see more of it in new block cipher and hash function designs. It's time to see what we can do, securely, without Feistels.
Bungle
Most Paranoid Member!


Joined: 03 Feb 2005
Posts: 2



Posted: Fri Apr 15, 2005 12:23 pm

Hi JT

Quote:

Hehe. No worries. I welcome criticism, as it provides new perspective on ideas. New perspective leads to the opening of new avenues of thoughts, and is beneficial to the learning process of which we will forever be a part of.


Phew, I’m glad you took it light-heartedly. I just thought I would stir the crypto gang (JT, Datah and mxb) into action!

Quote:

So, thanks for perusing through the paper and sharing your opinion! It is appreciated.


Thank you for providing it. I must admit I didn’t understand it all, but I think I got the general idea.

I apologise in advance for compressing your very comprehensive last post down in a very oversimplified way, but I was wondering, am I correct in saying the following?

“Serpent and Twofish are probably the most secure at the moment but Rijndael has more potential for improvement in the future.”


Thanks,

Bungle.
huh
Just Arrived


Joined: 07 Apr 2005
Posts: 0



Posted: Sat Apr 16, 2005 1:12 pm

I don't know what's going on, but I wrote a long reply, spell-checked it, inserted it into the document, and then submitted. But after that it told me to sign back in, and everything I wrote was lost! And this is not the first time, either. Why do I have to log in once, then log in again after submitting, then log in again to re-submit?

Basically, in a nutshell, thank you for your patience with me. Without derailing from my original question and my specific problem, please inform me in layman's terms for the moment, without a mathematical formula or code.

- Should I stop layering by encrypting more than once using PGP 8.1? PGP does NOT have Caesar's cipher (eh, I don't think anything does), but this was just an example to show the concept.

If so, what alternatives do I have to reinforce the encryption and reduce its susceptibility to attack/cracking? Should I just rely on one encryption pass using PGP with a 4096-bit key? What's the best combination?

OK, I must go for now. Shame about my long answer that was lost, but thank you again. And Bungle, your thread for which you got the award seems like the thing I am looking for; I will check it out, but not for the moment.
data
Forum Fanatic


Joined: 08 May 2004
Posts: 16777211
Location: India


Posted: Sat Apr 16, 2005 9:13 pm

hi,

huh wrote:
Why do I have to log in once then login again after submitting then again log in to re-submit?


That must be because your cookie is expiring pretty fast (maybe in ten minutes or so). Copy what you type onto Notepad before posting; that should save you a lot of trouble. That's what I generally do.
Quote:

-Should I stop layering by encrypting more than once using PGP 8.1? PGP does NOT have ceaser's cipher (ehh i dont think anything does) but this was just an e.g to show the concept.


It's better not to do any layering with the algorithms you use in PGP.

Quote:
If so, what alternatives do I have to reinforce the encryption and it susteptibity to attack/crack... should I just rely on one encryptio using PGP using a 4096 size key? Whats the best combination.


4096 would be the key size for the RSA key? You didn't mention the symmetric encryption algorithm. I think PGP supports IDEA; it's a good symmetric-key algorithm to use. Please check their documentation.
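For context, the 4096-bit figure and the symmetric cipher play different roles, because PGP is a hybrid system: the public key only wraps a random session key, and the symmetric cipher encrypts the actual message. Here is a minimal conceptual sketch of that split in Python, using the cryptography package with RSA-OAEP and AES-GCM as stand-ins (assumptions for illustration; PGP's real message format and cipher choices differ):

Code:

import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Recipient's long-term 4096-bit RSA key pair.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=4096)
public_key = private_key.public_key()

# A fresh symmetric session key does the bulk encryption...
session_key = AESGCM.generate_key(bit_length=128)
nonce = os.urandom(12)
ciphertext = AESGCM(session_key).encrypt(nonce, b"the message", None)

# ...and the RSA key only encrypts the small session key.
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
wrapped_key = public_key.encrypt(session_key, oaep)

So the practical security rests on both halves, the RSA key size and the symmetric algorithm, which is why the choice of cipher matters even with a 4096-bit key.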

Sarad.
huh
Just Arrived


Joined: 07 Apr 2005
Posts: 0



Posted: Sun Apr 17, 2005 12:11 pm

thanks datah.

OK, for now I'll implement this. However, once I become a bit more knowledgeable about this and encryption, I would like to pursue it further and also understand it... so watch this space.

thank you yet again
JustinT
Trusted SF Member


Joined: 17 Apr 2003
Posts: 16777215
Location: Asheville, NC, US / Uberlândia, MG, Brazil


Posted: Sun Apr 17, 2005 1:19 pm    Post subject: Some early morning commentary.

Bungle wrote:

I apologise in advance for compressing your very comprehensive last post down in a very oversimplified way, but I was wondering, am I correct in saying the following?

“Serpent and Twofish are probably the most secure at the moment but Rijndael has more potential for improvement in the future.”


Well, in the generic sense, yes: in regards to how many rounds have been successfully attacked, as opposed to the total number of rounds, for a given key length. However, it isn't an entirely accurate portrayal of how secure these primitives are against cryptanalysis, such that they can be compared on those grounds alone, so superficially. (For instance, what effect does adding one round have on a particular attack?) This will differ for each primitive, as their round transformations are designed to achieve certain properties, both similar and dissimilar, in different ways.

So, given the configuration in which they are normally used, and referring to the generic notion of a "security margin", your summation is mostly correct. When I began researching the wide trail strategy, in other designs within the same general family of block ciphers from which Rijndael was spawned, I noticed the peculiar modularity behind the strategy.

As such, I concluded that adding rounds would be a relatively simple way to gain even further security against extensions of attacks, since, as the key and block lengths grow, more becomes available for the adversary to work with. This, coupled with achieving certain properties to protect against linear, differential, and truncated differential attacks, just to name a few, means that there's a point where we need a minimum of so many rounds.

Beyond that point, adding rounds becomes a conservative measure, in a sense. So, my conclusion was that more rounds should have been specified for the use of Rijndael, within the confinement of the AES standard. As I came to realize as my research progressed, several cryptographers had already been sharing this opinion, quite some time before I pondered the idea.
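For reference, the round counts the AES standard actually specifies are a simple function of key length: Nr = Nk + 6, where Nk is the key size in 32-bit words (this is straight from FIPS 197). A short sketch makes the current totals explicit:

Code:

# AES (FIPS 197) round counts: Nr = Nk + 6, with Nk = key length in 32-bit words.
for key_bits in (128, 192, 256):
    nk = key_bits // 32
    print(f"AES-{key_bits}: {nk + 6} rounds")
# -> 10, 12, and 14 rounds; the debate is whether those totals are conservative enough.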

So, I feel that the biggest improvement would be to revise the AES standard and increase the number of rounds, for each key length. Serpent and Twofish are generically more conservative, in this regard. That's really the gist of it. Another point of debate, and one which I can agree with from both sides, in certain aspects, is the simplicity and complexity of analysis. Allow me to elaborate just a little.

Rijndael is incredibly simple to analyze, largely because of the organization and segregation of its components; it is what you might call "elegant." Rather tidy. Twofish, on the other hand, is often referred to as being a complex block cipher to analyze, largely because of its key-dependent S-boxes and various operations. We certainly want simplicity, but where should the line be drawn?

Would too much simplicity aid an adversary? Would too much complexity deter cryptanalysts from making a meaningful determination of the security of the primitive? Perhaps both cases could be realized. Perhaps for Rijndael and Twofish, their respective strategies are secure enough to mitigate the effects of this issue, if it's really an issue to be concerned with being affected by.

Below is some commentary I shared elsewhere, with the first quoted block addressing a remark that Twofish is too complex, and the second quoted block addressing a successive remark that key-dependent S-boxes were shown to be weak (i.e., whether this would affect Twofish, which uses this type of substitution table).

JustinT wrote:

(While the following comments are not in objection, as I mostly agree with the quoted statements above (as has been documented, almost verbatim, in various publications), I believe there's a little more to it than writing it off as overly complex. There are several good, simple principles behind it.)

And so goes the opinionated commentary that flourished throughout the AES selection process. The MARS team was an advocate of comments such as the difficulty of analyzing Twofish based on its construction; as such, there are opposing arguments by the Twofish team, and so on and so forth, for all teams with finalist AEA candidates. In other words, that's looking at it subjectively, when it requires objective thinking to appreciate it as a whole. It's too opinionated not to. Elegance is subjective. The agitated, animadversion-induced scrutiny that each block cipher faced was marked by such contravention that you're almost forced to understand the mathematics behind the AEA candidates to properly weigh each perspective, and to understand the balance between design criteria and trade-offs.

I think "complex" would be more appropriate than "too complicated", simply because there is a certain level of complexity that was intended, and in all actuality, there is a certain level of simplicity behind the components that Twofish consists of; there is nothing that difficult to understand about the components in the F function, such as pseudo-Hadamard transforms, or maximal distance separable matrices, in the g and h functions. These have been realized as cryptographic components for a decade or longer, so there is a bulk of the cipher that isn't really as complex as it's made out to be. This, in turn, sparks regurgitated comments by the cryptographically uneducated, which causes them to fail to see beyond a comment they read, thus lacking a balanced perception of it. Complexity is also partly subjective; if you look at components separately, as opposed to how they interact, there may be a world of difference.

The peculiar key-dependent substitution tables are, essentially, where this notion of complexity sinks in, but that's nothing we didn't know about already; this is a trade-off that better fit the design rationale, which included meticulously crafted S-boxes, for known attacks (i.e., statistical), and a level of secrecy and key-dependency, for unknown attacks. While the latter may be impossible to actually build to resist, it's not outlandish to assume a certain level of confidence in such. Also, complexity may be seen throughout the various routines (e.g., rotations) and the mixture of algebraic groups represented, and how this affects approximations and their respective accuracy; however, sometimes, when you're addressing other criteria, some complexity will sneak in, inevitably, and sometimes, some complexity isn't necessarily bad. Breaking up structure has its benefits.


JustinT wrote:

(You can view Twofish as having either key-dependent or key-independent S-boxes; they are built from fixed permutations. Learn how they work, to understand that “double perspective.”)

Sure; it's possible. He has also pointed out instances where they increase security over other constructions. An S-box can be poor - key-dependent or not, fixed, random, meticulously chosen, or however you cook them; just as well, it's possible to realize secure instances of these different constructions, under certain conditions. Structurally, if the underlying block cipher is especially susceptible to differential or linear cryptanalysis, no S-box of any form may be sufficient to resist it. Aside from how they are built, it depends on how they interact with various other components in the primitive. That's pretty obvious stuff.

Key-dependency can be rather appealing, cryptographically, when done correctly. With key-dependency, it's also possible to make building linear or differential characteristics more difficult. “Correctly” is defined by “how”, “where”, and “why” it's used. It helps to understand the criteria for which they are designed to satisfy – the context.

I believe Twofish uses a design strategy that is not only successful, but useful as a "blueprint" for future designs using similar concepts. Key-dependent S-boxes play a large role in that (e.g., take a look at how key-dependency affects diffusion of the key into the state of output). So, that statement alone is a bit empty, without some sort of context; the context usually depends on which algorithm is in question and what is expected of the S-box(es), given the design rationale.

But, overall, yes – there is nothing invalid about saying that the structural semantics of Twofish are somewhat difficult to analyze, and this did come into play during the selection process, but there is a counter-argument for that; the designers acknowledge both points. Both simplicity and complexity have the tendency of being either “just right” or “too much.” Which, exactly, isn't always obvious. While a certain level of simplicity, on the one hand, is vital in the design of a primitive, too much simplicity, structurally, may facilitate analysis not only for academic cryptanalysts, but for unknown adversaries.

On the other hand, while a seemingly complex structure may make analysis difficult for academic cryptanalysts, our approximations may suffice, and this same difficulty may make analysis just as difficult for an unknown adversary, by reducing overly simplistic structures that may otherwise facilitate attack strategies. Again, it's a trade-off, and one that, I think, serves Twofish well, in a myriad of cases. Of course, to a certain extent, I find it valid to look at this from an opposing perspective, since it's one of those issues that is still open. It's possible to justify both simplicity and complexity, if kept reasonable; this is especially so for complexity, since it truly is, in excess, the cardinal sin of an allegedly, cryptographically, secure system. Keep things as simple as they can be, given your design rationale, which should be reasonable. You have to know when and where to be either.

There is, however, enough understanding for, and literature available on, the individual components, and the general Feistel structure, to make a significant effort at analyzing it. My concern is for those who merely read opinionated remarks made during the conferences for the selection of an AEA for the AES, without the realization that Twofish was one of the more rationale-driven candidates. As aforementioned, the perception should be objective, to fully appreciate the design. Cryptanalysis, as we conceptualize it these days, is largely theoretical, as are notions of a “margin of security”, which, albeit a useful formulation, is extremely volatile, as are many assertions being made for block cipher security in general. Thus, it's good to explore different routes that lead us to different, sound strategies. The contest was successful - quite a bit more so than many expected.

What we're left with is a design that is among the most flexible we've seen, in terms of trade-off versatility, with a simple, modular concatenation of components, that not only holds up well, but performs well. This is a necessity in a conventional primitive - to secure and perform. So, in conclusion, I feel that the simplicities and complexities of Twofish are decently balanced, and conform quite nicely to the “math and muddle” approach, rendering a strategy that has proven practically robust, thus far; in fact, it's one of the best performance-driven block ciphers we have, considering the notion of “margin of security.” Once you [the uninformed, in general] comprehend Twofish, and understand what's going on in the functions it is built around, I suppose, then, it may all become blatantly obvious. After all, a clue hasn't been known to jump up and bite someone!

(I'm not partial towards Twofish, for any non-cryptographic reason; these are just observations I've made during mathematical research and a cursory glance at publications. Any level-headed cryptographer (i.e., an academic) should invest a little time into learning these things. Even though recommending the current AES is reasonable, it's just as reasonable to appreciate the design strategy of other finalist AEA candidates, including Twofish, which is one of the most important. Looking at it objectively, and thoroughly, will require a proficiency in the mathematics that compose these primitives, as well.)
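To make the earlier remark that Twofish's S-boxes "are built from fixed permutations" a bit more concrete, here is a toy sketch of that general construction in Python: fixed 8-bit permutations composed with key-byte XORs yield an S-box that is key-dependent yet still invertible. This is purely conceptual; Twofish's actual h function, and its carefully constructed q0/q1 permutations, are more involved than the random stand-ins used here:

Code:

import random

# Two fixed 8-bit permutations (random stand-ins for illustration;
# Twofish's q0/q1 are carefully designed, not random).
rng = random.Random(2005)
q0 = list(range(256)); rng.shuffle(q0)
q1 = list(range(256)); rng.shuffle(q1)

def keyed_sbox(k0: int, k1: int) -> list:
    """Key-dependent S-box: S(x) = q1[q0[x] ^ k0] ^ k1."""
    return [q1[q0[x] ^ k0] ^ k1 for x in range(256)]

# Each keyed instance is still a permutation of 0..255 (invertible),
# but which permutation you get depends on the key bytes.
s = keyed_sbox(0x5A, 0xC3)
assert sorted(s) == list(range(256))

The "double perspective" mentioned above falls out of this: you can view the result as one key-dependent S-box, or as fixed, key-independent permutations with key material mixed in between.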


As I've stated in the past, any of the three block ciphers should suffice, for any practical purpose; the biggest difference will be obvious in their implementation characteristics in software and hardware. I feel it would be most beneficial to continue analyzing and exploring Rijndael, along with the wide trail strategy. This is one area where simplicity is crucial, since analysis can render more accurate approximations and obtainable security goals, and a lot of ground can be covered in a shorter amount of time.

Also, the indirect benefit of doing so is that a better knowledge of the wide trail strategy will assist us in designing secure hash functions based on it, since hash function design is in dire need of fresh strategies, immediately. After all, a hash function is one of, if not the, most versatile primitives in cryptography.

The bottom line is, we need to continue investigating ways not only to improve the underlying strategy, as used in Rijndael, while retaining its efficiency, but also to improve ways to implement it securely in hardware, so side-channel attacks aren't practical; there has been some fascinating work in this area (e.g., the cache-timing attack on AES). It's at the implementation level where we should be most concerned. Structurally, I believe Rijndael can be extended to conservative levels of security, generically, and in time, hopefully, we'll be able to construct even better block ciphers, using what we've learned from Rijndael. So, given that, it's worth every effort to retain the use of Rijndael, and not find ourselves in a position where the standard is abandoned unnecessarily.
Bungle
Most Paranoid Member!


Joined: 03 Feb 2005
Posts: 2



Posted: Sun Apr 17, 2005 1:47 pm

Hi huh,

Quote:

If so, what alternatives do I have to reinforce the encryption and reduce its susceptibility to attack/cracking? Should I just rely on one encryption pass using PGP with a 4096-bit key? What's the best combination?


I think you may find that it is not the encryption that will be your weak point. There are many other things to take into consideration first, as I explained earlier in this thread.

Hi JT,

I was just in the process of posting this when I noticed your reply. I think it is going to take me a while to digest your response! Another JT crypto knockout!


Thanks,
Bungle
huh
Just Arrived


Joined: 07 Apr 2005
Posts: 0



Posted: Tue Apr 19, 2005 1:09 pm

OK, now that I have stopped layering in PGP, I would like to understand the reason behind it, so bear with me, please. Do the ciphers available in PGP 8.1 have the same effect of leaving your message "decrypted" when using encryption layering?

So, what is the best combination of ciphers to use in PGP? And should I only encrypt ONCE?
helped
Just Arrived


Joined: 17 Apr 2005
Posts: 0



Posted: Mon May 02, 2005 12:11 pm

I was just browsing through the official PGP forum and came across a post that those of you discussing encryption layering may be interested in. Here is the post: http://forums.pgpsupport.com/viewtopic.php?t=2904

Here is a quote from Nathan of PGP support:

Quote:
The way that block ciphers are implemented with PGP, it doesn't seem likely that you will have your blocks lining up precisely, so if you are encrypting to the same key twice with off-the-shelf PGP, you will not receive any additional security. I doubt that you would have any additional weakness, though, because PGP does attempt compression before encryption, etc. We could probably have a long debate on the subject and have no real answers--just conjecture.

If you wrote a program that would apply AES-128 twice with the same key, you would only get 129 bits of effective encryption. If you did the same thing, but applied the algorithm 1024 times, you would get an effective keylength of 138 bits. Here's where the real debate about relative strength, and possible cryptanalytic breaks that could be derived from repetitive uses of the same key, would come in... What it comes down to in the end is that real improvement of security from multiple rounds of encryption requires different keys, as is demonstrated with Triple-DES.

By performing operations with multiple keys 3DES does create an effectively longer keylength to overcome the weakness that it originally faced with a 56-bit encryption key. (Unfortunately 3DES is also incredibly slow.)

My point in summary is that you probably won't gain any security from running the encryption more than once. You probably want to think about bigger key-lengths as a means of achieving greater security in the actual data-encryption.


...
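Nathan's figures check out as simple work-factor arithmetic: encrypting t times under the same k-bit key leaves the keyspace at 2^k but multiplies the attacker's per-guess work by t, for roughly k + log2(t) bits of effective work. A quick sanity check in Python (effective_bits is just illustrative arithmetic, not anything from PGP):

Code:

import math

def effective_bits(key_bits: int, layers: int) -> float:
    # Same-key layering multiplies the per-guess work by `layers`;
    # the keyspace itself stays at 2**key_bits.
    return key_bits + math.log2(layers)

print(effective_bits(128, 2))     # 129.0 (AES-128 applied twice)
print(effective_bits(128, 1024))  # 138.0 (applied 1024 times)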
huh
Just Arrived


Joined: 07 Apr 2005
Posts: 0



Posted: Sat May 07, 2005 12:35 pm

Hi helped, I checked the forum out, and PGP support is saying that one can use more than one different key for encryption layering, because it doubles the work of a brute-force attack. But that is not what I gathered from this forum.

Can anyone shed some light on this for me?
comrade
Just Arrived


Joined: 15 Feb 2005
Posts: 0



Posted: Sat May 07, 2005 1:15 pm

Don't bother trying to layer encryption.

You have bigger worries.
huh
Just Arrived


Joined: 07 Apr 2005
Posts: 0



Posted: Mon May 09, 2005 11:51 am

comrade wrote:
Don't bother trying to layer encryption.

You have bigger worries.


Another member against encryption layering! Why? PGP support says it works.

What are these other worries, if I may ask?