[00:00:08] Speaker 00: Alright. [00:00:36] Speaker 00: Good morning. [00:00:46] Speaker 04: Please have a seat. [00:00:52] Speaker 04: We have a full day in front of us today. [00:00:56] Speaker 04: We're going to hear four cases on oral argument. [00:01:01] Speaker 04: We're deciding two cases on the briefs. [00:01:04] Speaker 04: Our first case this morning is Hillcrest Laboratories, Inc. [00:01:08] Speaker 04: v. Movea, Inc. Mr. Barney. [00:01:13] Speaker 04: And I understand you're reserving five minutes for rebuttal. [00:01:16] Speaker 02: Yes, Your Honor. [00:01:18] Speaker 04: All right. [00:01:18] Speaker 02: May it please the court, Mr. Barney on behalf of Hillcrest Labs. [00:01:22] Speaker 02: There are two claim construction issues on appeal. [00:01:26] Speaker 02: And I'd like to begin with what is perhaps the easier of the two. [00:01:31] Speaker 02: And that is the rotational output limitation. [00:01:33] Speaker 02: The reason I believe that is the easier of the two is that the parties actually agree on the correct claim construction of this term. [00:01:40] Speaker 02: They also agree, there appears to be no dispute, that the board's analysis of this limitation was wrong. [00:01:47] Speaker 02: So the only question is whether there is an alternative ground for affirmance that this court can reach, and we believe the answer is no. [00:01:53] Speaker 02: Every claim at issue in this appeal requires a tilt compensation equation that has three inputs. [00:01:58] Speaker 02: Tilt is denoted as theta, [00:02:01] Speaker 02: a first rotational output alpha y, and a second rotational output alpha z. The parties on appeal have agreed that rotational output, as properly construed, means the sampled output of a rotational sensor such as a gyroscope.
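The claimed equation, as counsel describes it, takes a tilt angle theta and two rotational outputs, alpha y and alpha z. A generic tilt compensation of that shape can be sketched as a 2D rotation; this is an illustration only, and the Liberty patent's exact equation may differ:

```python
import math

def tilt_compensate(alpha_y, alpha_z, theta):
    """Rotate two rotational-sensor outputs (alpha_y, alpha_z) by a sensed
    tilt angle theta (radians) so that cursor motion tracks the screen's
    horizontal/vertical axes instead of the device's tilted axes.
    A generic 2D rotation of the shape counsel describes; the Liberty
    patent's exact equation may differ."""
    comp_y = alpha_y * math.cos(theta) + alpha_z * math.sin(theta)
    comp_z = -alpha_y * math.sin(theta) + alpha_z * math.cos(theta)
    return comp_y, comp_z

# With zero tilt, the sensor outputs pass through unchanged.
print(tilt_compensate(1.0, 0.0, 0.0))  # (1.0, 0.0)
```

As tilt grows, the two rotational outputs mix, which is what lets the cursor follow the screen axes rather than the device axes.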
[00:02:14] Speaker 02: So these claims require two rotational sensors that measure rotation about the y and the z axes, and using the nomenclature of the Liberty patent, that corresponds to horizontal and vertical movements of the cursor on the screen. [00:02:29] Speaker 02: Now the board mapped [00:02:30] Speaker 02: the equation in column 21 of Ide, which is the prior art reference that's at issue here, to the claimed equations in the Liberty patent. [00:02:38] Speaker 02: In doing so, it acknowledged that the horizontal and vertical components of that equation in Ide are derived from linear sensor outputs, not rotational sensor outputs. [00:02:50] Speaker 02: Again, Movea does not dispute that the board's analysis in that regard was incorrect, because as the parties now agree, [00:02:57] Speaker 02: rotational output means the output of a rotational sensor, not a linear sensor. [00:03:03] Speaker 02: So Movea is not defending that rationale on appeal. [00:03:07] Speaker 02: Instead, what it's doing is arguing that the board's judgment, although it includes an error in the analysis, can nevertheless be affirmed on alternative grounds. [00:03:17] Speaker 02: Specifically, Movea contends that Ide teaches, as an alternative, [00:03:21] Speaker 02: the use of rotational sensor outputs as inputs to the equation in column 21 of Ide. [00:03:28] Speaker 02: We disagree that Ide discloses that. [00:03:32] Speaker 02: There is only one rotation compensation equation in Ide, and it's in column 21. [00:03:37] Speaker 02: It corresponds to the fourth embodiment of Ide. [00:03:41] Speaker 02: That's the embodiment that the examiner and the board relied on in rejecting the claims. [00:03:45] Speaker 02: There are no other compensation equations anywhere else in Ide. [00:03:50] Speaker 02: And as the board correctly found, that compensation equation in column 21 of Ide is based on linear sensor outputs, not rotational sensor outputs.
[00:04:00] Speaker 02: And again, Movea is not attempting to defend that erroneous rationale. [00:04:07] Speaker 02: So we believe that there is a factual dispute here about whether or not Ide actually discloses that. [00:04:13] Speaker 02: But what's important is there are no factual findings in the record [00:04:17] Speaker 02: that would support Movea's alternative argument that it's making on appeal. [00:04:21] Speaker 02: As I said, it's undisputed that the board relied on linear sensor outputs when it mapped Ide's equation to the claimed compensation equation. [00:04:30] Speaker 02: And so you won't find any factual findings in that portion of the board's opinion that support the alternative theory being presented by Movea. [00:04:40] Speaker 02: So what Movea is pointing to for its factual support [00:04:43] Speaker 02: is the board's anticipation findings for two dependent claims, dependent claims 6 and 17. [00:04:49] Speaker 02: And it tries to bootstrap from these findings a more granular finding regarding the use of rotation sensors in Ide's compensation equation. [00:04:58] Speaker 02: The problem is, and this is all in the briefs, the problem is when you look at the limitations that are actually added in those dependent claims, they don't recite the use of two rotation sensor outputs as the inputs to a compensation equation. [00:05:12] Speaker 02: And so you can't infer anything about that limitation from the board's findings on those two dependent claims. [00:05:20] Speaker 04: Counsel, I found this case complex. [00:05:26] Speaker 04: And to be honest, the briefing didn't help much. [00:05:31] Speaker 04: Let me ask you, how do the claims treat inertial frame of reference? [00:05:36] Speaker 04: How is that different [00:05:37] Speaker 04: from user's frame of reference? [00:05:39] Speaker 02: Sure, Your Honor. [00:05:39] Speaker 02: That's a different limitation, but I'm happy to turn to that now. [00:05:42] Speaker 04: Let's go to that one.
[00:05:43] Speaker 02: I wanted to start with the rotation one, by the way, just because I think it's a similar issue. [00:05:47] Speaker 02: They're asking for an alternative ground for affirmance. [00:05:49] Speaker 02: There are no facts to support that alternative ground. [00:05:51] Speaker 02: It requires a remand. [00:05:52] Speaker 02: I'm happy to turn to the other limitations. [00:05:55] Speaker 02: So Liberty teaches rotating measured movements into two different types of frames of reference: [00:06:02] Speaker 02: a user frame of reference, or an inertial frame of reference. [00:06:05] Speaker 02: And as Your Honor put your finger on, the dispute between the parties here on appeal is whether there's a difference between those two. [00:06:12] Speaker 02: And if so, whether the board's construction of inertial frame of reference actually captures that distinction. [00:06:18] Speaker 02: The board construed, or adopted, the construction of inertial frame of reference that Movea argued in its petition, which is a frame of reference associated with a screen's orientation. [00:06:30] Speaker 02: Now, Movea has put forth a new construction on appeal that it didn't argue below. [00:06:34] Speaker 02: But this court is being asked to review the construction that the board actually adopted and used in its rejections. [00:06:40] Speaker 02: And that construction was a frame of reference associated with a screen's orientation. [00:06:44] Speaker 02: The problem we have with that construction is that it gives no meaning to the word inertial. [00:06:49] Speaker 02: It's really just the same construction that would apply to a user's frame of reference. [00:06:53] Speaker 02: And in fact, the board's construction, if you look at the wording of it, comes almost verbatim [00:06:58] Speaker 02: from a description in the Liberty patent of an example of a user's frame of reference.
[00:07:02] Speaker 02: I would refer the court to appendix 47, column 16, lines 28 through 30. [00:07:08] Speaker 02: Now, the board disagreed with our argument and said that its construction does give sufficient meaning to the word inertial, and this is the board's rationale. [00:07:17] Speaker 02: According to the board, if you have a screen that's oriented vertically, on a wall or sitting on a desk, and then if you associate [00:07:25] Speaker 02: the frame of reference of the device with that screen, well, now the device's frame of reference will be associated with a frame of reference that is oriented according to gravity. [00:07:34] Speaker 02: And therefore, the device's frame of reference will also be inertial. [00:07:37] Speaker 02: That's the board's logic. [00:07:39] Speaker 02: That was also the logic of the examiner. [00:07:41] Speaker 02: There is a flaw in that logic. [00:07:43] Speaker 02: And the key word, and what I believe is the heart of the board's error, is the word associated, which is, by the way, a word that Movea seems to forget about in its brief. [00:07:53] Speaker 02: In Ide, which is the prior art that we're arguing about here today, it is the user who associates the device with the screen. [00:08:05] Speaker 02: And he does so by looking at the screen's orientation and essentially mimicking that orientation in the handheld device. [00:08:11] Speaker 02: That's what the illustrations in Ide are showing. [00:08:13] Speaker 02: You either orient it with a laptop that's tilted backwards, or you orient it with a screen that may or may not be sitting on a flat table. [00:08:20] Speaker 02: But regardless of how you do it, it's a manual process. [00:08:23] Speaker 02: Why? [00:08:24] Speaker 02: Because the device in Ide cannot sense the screen. [00:08:27] Speaker 02: The device in Ide has no idea what the orientation of the screen is. [00:08:31] Speaker 02: Moreover, the device in Ide doesn't know its own orientation relative to the screen.
[00:08:35] Speaker 02: As far as the device in Ide is concerned, the screen doesn't exist. [00:08:39] Speaker 02: It can't see it, it can't sense it, and it doesn't know its orientation relative to that screen. [00:08:43] Speaker 02: So the only way that you can, quote, associate the frame of reference of the device with the screen is to do it manually, which is exactly what Ide teaches. [00:08:51] Speaker 02: So your opponents argue that the proper frame of reference is the earth frame. [00:08:55] Speaker 02: That's what they're arguing on appeal. [00:08:56] Speaker 02: It's not exactly what they argued below. [00:08:58] Speaker 02: It's a new construction. [00:08:59] Speaker 02: Did they make that argument at all below? [00:09:02] Speaker 02: The claim construction they proposed below was the one the board adopted, which is a frame of reference associated with the screen. [00:09:11] Speaker 02: The new argument they've presented here still is incorrect, because they're presenting it in a way that would still allow a, quote, inertial frame of reference to be defined completely by a user, with no sensing of any external field such as gravity, which is exactly what Ide teaches. [00:09:31] Speaker 02: And the problem with that type of frame of reference, I mean, I say it's a problem, I would say a feature of that type of frame of reference, is that because the device doesn't know [00:09:41] Speaker 02: its own orientation relative to the screen, if the user gets it a little bit wrong, which is often the case, because it's very difficult to manually align something to a screen that is 15 or 20 feet away. [00:09:52] Speaker 02: And so if the user is off by maybe two or three degrees when he sets that frame of reference, the device isn't going to know that. [00:09:59] Speaker 02: The device is going to assume that that is the correct frame of reference for the screen. [00:10:02] Speaker 02: And then thereafter, every movement of the user will be a little bit off.
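Counsel's point that a small manual misalignment biases every later movement can be illustrated with hypothetical numbers; the three-degree error below is an assumption for illustration, not a figure from the record:

```python
import math

def rotate(vx, vy, angle):
    """Rotate the vector (vx, vy) by angle radians."""
    return (vx * math.cos(angle) - vy * math.sin(angle),
            vx * math.sin(angle) + vy * math.cos(angle))

# Hypothetical: the user eyeballs the frame three degrees off,
# but the device assumes the alignment is perfect.
error = math.radians(3)

# A purely horizontal hand movement of 100 units...
dx, dy = rotate(100.0, 0.0, error)

# ...arrives with a vertical component the device cannot detect or remove.
print(round(dx, 2), round(dy, 2))  # 99.86 5.23
```

Because the offset is baked into the frame the user set, the same few-percent vertical drift appears in every subsequent movement.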
[00:10:07] Speaker 04: Does the device know that the display unit is [00:10:11] Speaker 04: substantially aligned with gravity? [00:10:13] Speaker 02: No, it does not, Your Honor. [00:10:14] Speaker 02: The device in Ide only has a rotational sensor. [00:10:18] Speaker 02: And a rotational sensor is a relative sensor. [00:10:21] Speaker 02: All it can detect is a relative rotation of the device. [00:10:24] Speaker 02: It can tell that it's moving clockwise. [00:10:26] Speaker 02: It can tell it's moving counterclockwise. [00:10:28] Speaker 02: And you can do some calculations to figure out how far it's moved. [00:10:32] Speaker 02: But it has no idea where gravity is. [00:10:34] Speaker 04: But yet the figures, figures 33, 35A, and 35B in Ide, they all show [00:10:40] Speaker 04: displays that are clearly aligned with gravity. [00:10:44] Speaker 04: I mean, they're vertical. [00:10:45] Speaker 02: They're all vertical. [00:10:46] Speaker 02: Your Honor, this gets to the inherency argument, and I will grant you that that is one potential interpretation of that page. [00:10:54] Speaker 02: But I think the more salient argument is, even if we assume they're vertical, even if I grant you that those screens are vertical, that doesn't mean that a user who's standing 20 feet away and manually aligning the device with the screen is holding the device vertical. [00:11:07] Speaker 02: That's where the inherency comes in. [00:11:08] Speaker 02: The examiner assumed that the user is going to perfectly align that device with gravity simply because he's eyeballing it to a screen that may itself be aligned with gravity. [00:11:19] Speaker 02: But we're getting back to this idea of inertial frame of reference. [00:11:22] Speaker 02: The Liberty patent acknowledges what Ide uses as a frame of reference. [00:11:27] Speaker 02: And the Liberty patent calls that a user's frame of reference, because the user defines it.
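The relative-sensor point above can be sketched in a few lines: integrating angular-velocity samples yields only rotation relative to an unknown starting orientation. The function and values below are illustrative assumptions, not anything disclosed in Ide:

```python
def integrate_gyro(rates, dt, initial_angle=0.0):
    """Integrate angular-velocity samples (rad/s) into an angle (rad).
    The result is purely relative: unless initial_angle is known from
    some absolute reference (e.g. sensing gravity), the device cannot
    recover its true orientation. Illustrative only, not from Ide."""
    angle = initial_angle
    for rate in rates:
        angle += rate * dt
    return angle

# The identical motion, integrated from two different (unknown) starting
# orientations, yields two different absolute angles:
rates = [0.125] * 8  # constant 0.125 rad/s over 8 one-second samples
print(integrate_gyro(rates, dt=1.0))                     # 1.0
print(integrate_gyro(rates, dt=1.0, initial_angle=0.5))  # 1.5
```

A gyroscope alone supplies only the summed rates; the starting angle has to come from somewhere else, which is the crux of the dispute.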
[00:11:32] Speaker 02: It's not based on anything that's sensed outside of the device itself. [00:11:36] Speaker 02: And it also uses a different frame of reference, called an inertial frame of reference. [00:11:40] Speaker 02: And the patent treats those two frames of reference distinctly. [00:11:44] Speaker 02: They never are equated in the patent. [00:11:46] Speaker 02: They're never said to be synonymous with each other. [00:11:48] Speaker 02: This court's precedent requires that different terms that are used in the claims and in the specification are presumed to have different meanings unless proven otherwise. [00:11:57] Speaker 02: Here there's no proof that they were used with the same meaning. [00:11:59] Speaker 01: I'm still confused. [00:11:59] Speaker 01: I mean, what's the point of having [00:12:04] Speaker 01: the inertial frame of reference be a different frame of reference than the user frame of reference? [00:12:09] Speaker 01: I mean, you don't want the device to have a frame of reference that's different than what's on the screen, because then it's not showing on the screen properly. [00:12:17] Speaker 02: Your Honor, the difference is that an inertial frame of reference is only going to work [00:12:23] Speaker 02: when you're dealing with a screen that is aligned to the same inertial field that you're using. [00:12:28] Speaker 02: So it's a more limited example. [00:12:30] Speaker 02: That's why the patent talks about an inertial frame. [00:12:33] Speaker 01: When it's up and down, for instance. [00:12:36] Speaker 01: What universe on this planet would you have any other kind of reference [00:12:41] Speaker 01: than one that's going up and down? [00:12:44] Speaker 02: Well, Your Honor, a user's frame of reference can be any frame of reference.
[00:12:47] Speaker 02: So if you had a screen that was not oriented vertically, you could create your own user frame of reference that is oriented to that screen instead of one that is straight in front of you. [00:12:59] Speaker 02: Or if you're lying on your side and you're too lazy to sit up and move the cursor this way, you could define a frame of reference [00:13:05] Speaker 02: where you can move it while you're on your side, and the frame of reference will understand that when you do that, what you're really intending to do is this. [00:13:13] Speaker 02: That's what a user frame of reference can do for you. [00:13:16] Speaker 02: The patent describes an inertial frame of reference, and the techniques for rotating into an inertial frame of reference, as a subset of a broader set of techniques for translating into a user's frame of reference, which proves that it's a different scope. [00:13:30] Speaker 02: They're not equated. [00:13:32] Speaker 04: OK, you're into your rebuttal time. [00:13:33] Speaker 02: Thank you, Your Honor. [00:13:35] Speaker 04: We'll restore you back to four minutes. [00:13:38] Speaker 04: Yeah, four. [00:13:52] Speaker 03: May it please the court. I am appearing here on behalf of Movea. [00:13:58] Speaker 03: If I may, I'd like to start out by addressing one of the more recent points that were made by my colleague for Hillcrest. [00:14:06] Speaker 03: And this is the concept of eyeballing the frame of reference. [00:14:12] Speaker 03: It's an argument that Hillcrest is very fond of making, and the examiner disagreed with it, the board disagreed with it, and if you read Ide closely, it shoots down that argument. [00:14:23] Speaker 03: What Hillcrest does is they cite from column 20 [00:14:28] Speaker 03: of Ide, which is describing the problem that you would have if you're not perfectly aligned. [00:14:35] Speaker 03: This is at column 20, beginning at line 45. [00:14:44] Appendix page.
[00:14:50] Speaker 03: That's appendix page 760. [00:14:53] Speaker 03: I'm sorry, 766? [00:15:05] Speaker 03: So that's talking about the problem. [00:15:07] Where at? [00:15:08] Speaker 03: Sure, beginning at line 45. [00:15:11] Which column? [00:15:12] Speaker 03: Column 20. [00:15:17] Speaker 03: So there's a paragraph that begins, figures 28A to 28D. [00:15:20] Speaker 03: Then it goes on to say, the cursor moves in a different direction from his hand's movement in space unless the operator holds the mouse [00:15:32] Speaker 03: so that the horizontal and vertical directions of the mouse may correspond to the horizontal and vertical directions of the screen. [00:15:39] Speaker 03: So that's describing what may happen if you're not perfectly aligned. [00:15:44] Speaker 03: And that's what Hillcrest cites. [00:15:46] Speaker 03: If you continue on to column 21, and this is a very important point, beginning at line 10, on the next page, 761 of the appendix, [00:16:00] Speaker 03: if you continue reading at column 21, beginning at line 10, Ide, the prior art, to overcome this problem, it then talks about using the rotation amount sensing element to figure out how much of a difference in rotation there is. [00:16:19] Speaker 01: But I think what your friend is arguing is that that still doesn't do what their patent does, which is [00:16:27] Speaker 01: automatically hook up the device with the inertial frame of reference, you know, based on gravity. That you first, under Ide, still have to have it set manually to say this is the frame of reference, and then all it does is, once you've got it set, controls for rotation. [00:16:49] Speaker 01: I mean, I have the same concerns that my colleague did. [00:16:54] Speaker 01: This case is [00:16:55] Speaker 01: bafflingly complex, and I think you've both done a not very good job of explaining the technology. [00:17:00] Speaker 01: And it's not all that complicated of technology.
[00:17:03] Speaker 01: It's pointers that operate not on the table but in space. [00:17:08] Speaker 01: But from this, I don't get why this is saying that it doesn't do what [00:17:14] Speaker 01: he says, although I also could agree with you that it does. [00:17:19] I just thought, where are we? [00:17:20] Speaker 03: Ide is solving the same problem, which is when there's a mismatch between what the user intends and what shows up on the screen. [00:17:30] Speaker 03: And so it's solving that same problem, and it's doing it by figuring out, okay, how much of an error is there? [00:17:37] Speaker 01: But what state does it calculate that error from? [00:17:42] Speaker 03: It calculates it. [00:17:43] Speaker 03: Well, the examiner. [00:17:45] Speaker 01: Let me give you one more try. [00:17:47] Speaker 01: I have no idea whether this is right, because it's complicated and I hear two different explanations. [00:17:55] Speaker 01: But it seems to me that your friend is saying, with our device, you don't have to do kind of anything. [00:18:01] Speaker 01: It just automatically hooks up with the screen and recognizes that the screen is in a certain rotation, and does it, [00:18:09] Speaker 01: and then controls for the rotation thereafter. [00:18:12] Speaker 01: It seems like they're saying Ide doesn't do that, that first you have to kind of, the user has to manually set and say, here's the cursor at a certain thing, and all it does is correct afterwards. [00:18:24] Speaker 03: Do you understand what I... Sure, if I can address the manual setting part. Ide discloses two different inventions. [00:18:31] Speaker 03: One is the cursor control, which is what's relevant here. [00:18:35] Speaker 03: Ide also discloses something completely irrelevant to us, [00:18:38] Speaker 03: a pattern input, a gesture input, where you're going to move your hand in a certain pattern, and it detects that pattern.
[00:18:44] Speaker 03: That button in Ide is to switch from one mode to the other. [00:18:49] Speaker 03: It's not for saying, okay, I am now setting my frame. [00:18:52] Speaker 03: This is the frame that I want. [00:18:54] Speaker 03: It's just telling the device I'm switching from one mode to the other. [00:18:58] Speaker 03: So back to your point of how does Ide know what that offset is? [00:19:04] Speaker 03: And there's two responses to that. [00:19:08] Speaker 03: Ide mentions that the problem is that there's an offset, that there's a rotational offset. [00:19:14] Speaker 03: And so to solve that, Ide discloses using a rotation amount sensing element. [00:19:21] Speaker 03: Ide never describes the specific technology for implementing that. [00:19:27] Speaker 03: And in the re-examination, the examiner agreed that it would either be inherent to use an accelerometer, [00:19:36] Speaker 03: because that was something very well known at the time for determining that. [00:19:41] Speaker 03: And an accelerometer is something that can determine how much you've shifted relative to the gravity axis. [00:19:50] Speaker 03: So that's what the examiner found. [00:19:53] Speaker 03: Does that answer the question hopefully a little bit better? [00:19:57] Speaker 01: Well, you said the examiner found that, and I do see where the examiner [00:20:01] Speaker 01: found that Ide could at least inherently disclose an accelerometer. [00:20:05] Speaker 01: I didn't see the board necessarily relying on that. [00:20:09] Speaker 03: Well, the board actually did agree with that. [00:20:15] Speaker 03: And what the examiner found was that Ide just discloses a rotation amount sensing element without giving a specific piece of hardware to do it. [00:20:25] Speaker 03: And the examiner also found that Ide never [00:20:29] Speaker 03: really precluded any particular element.
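The premise that an accelerometer can determine how far the device has shifted relative to the gravity axis can be sketched as a generic tilt-from-gravity computation; the axis names and units below are assumptions for illustration, not anything Ide itself discloses:

```python
import math

def tilt_from_accelerometer(a_y, a_z):
    """Estimate tilt (radians) about the device's pointing axis from a
    two-axis accelerometer reading of gravity, taken while the device is
    at rest. A generic tilt-from-gravity computation; axis names and
    units (m/s^2) are assumptions for illustration, not Ide's disclosure."""
    return math.atan2(a_y, a_z)

# Device held level: gravity falls entirely on the z axis, so tilt is zero.
print(tilt_from_accelerometer(0.0, 9.81))  # 0.0

# Device rolled 45 degrees: gravity splits equally between the two axes.
print(round(math.degrees(tilt_from_accelerometer(6.94, 6.94)), 1))  # 45.0
```

Unlike the gyroscope's relative output, this gives an absolute angle referenced to gravity, which is why the inherency finding matters to the parties.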
[00:20:31] Speaker 03: The examiner also found, based on the evidence that was presented and her analysis of it, that it would be inherent to use an accelerometer. [00:20:40] Speaker 03: That's something that was very well known to a person of ordinary skill at the time for sensing these types of things. [00:20:46] Speaker 03: And there was a secondary reference that was submitted, the ADXL Analog Devices data sheet, which disclosed an accelerometer used for this particular type of application. [00:20:57] Speaker 03: So either it was inherent, [00:20:59] Speaker 03: or it's Ide plus a combination with the well-known accelerometer. [00:21:07] Speaker 03: Back to, I really want to talk about this rotational output issue. [00:21:13] Speaker 03: If you look at the board's decision, this is in the appendix, pages 11 to 12. [00:21:26] Speaker 03: It's kind of confusing: [00:21:28] Speaker 03: page 10 of the board's decision is at page 11 of the appendix. [00:21:34] Speaker 03: The paragraph at the bottom of the page begins, nor do we find error. [00:21:39] Speaker 03: So there, the board is addressing this issue of, does Ide provide a linear output, or does it provide a rotational type of output? [00:21:51] Speaker 03: And the board starts out by discussing what the examiner found. [00:21:55] Speaker 03: And they say, you know, nor do we find error in the examiner's mapping of these values to the recited rotational outputs, because the outputs depend on the rotational shift, which is the theta m. And so they say they are rotational in that sense. [00:22:11] Speaker 03: The next sentence, you have to read very carefully. [00:22:14] Speaker 03: They say, this is the board, we reach this conclusion despite the values x prime and y prime representing linear movements, [00:22:23] Speaker 03: as patent owner contends. [00:22:26] Speaker 03: So yes, they use the word linear, but they're talking about what the patent owner contended.
[00:22:33] Speaker 03: And if you go back and look at what the examiner had said, which the board was reviewing here, the examiner was very clear: she said that Ide discloses rotational outputs because it's a shift in the angle. [00:22:50] Speaker 03: And what the examiner said [00:22:53] Speaker 03: is at page 838 in the appendix. [00:23:06] Speaker 03: Towards the bottom of the page, about three, four lines up, this is in the right of appeal notice. [00:23:13] Speaker 03: The examiner says that x prime and y prime depend on the angle, et cetera, theta m. Consequently, they are rotational outputs. [00:23:23] Speaker 03: And to support that, if you read Ide, Ide talks about using piezoelectric gyroscopes to sense the rotation, and also says that those elements can determine angular velocity. [00:23:41] Speaker 03: And angular velocity is a concept of rotation, how much speed you have as you're moving along a circle, as opposed to a linear movement. [00:23:53] Speaker 03: So what we have here is the board reviewing what the examiner said. [00:23:59] Speaker 03: And the examiner said, these are rotational outputs because Ide discloses them that way. [00:24:04] Speaker 03: Ide talks about sensing angular velocity. [00:24:08] Speaker 03: And the board then says, we reached the same conclusion despite patent owner contending that these are linear movements. [00:24:21] Speaker 04: Let me take you back to the construction of inertial frame of reference. [00:24:24] Speaker 03: Yes. [00:24:26] Speaker 04: In your brief, you say, it's clear that the PTO understood inertial frame to mean a frame that is fixed relative to the earth frame. [00:24:35] Speaker 04: And this argument that you're making, which was pretty lengthy in your brief, addresses the issue whether the board explicitly identified the frame that it believes the inertial frame is fixed relative to.
[00:24:49] Speaker 04: And your response to that argument [00:24:51] Speaker 04: is that it doesn't matter, because it's clear that the board was referencing a frame that is fixed relative to the earth frame. [00:25:02] Speaker 04: Where in the decision is the term earth frame used? [00:25:07] Speaker 03: The term itself, earth frame, is not used. [00:25:10] Speaker 03: But what is used in the board's decision, when they arrived at that claim construction, they did address the concept of inertial: [00:25:20] Speaker 03: that the frame of reference, as they define it, the inertial frame, is associated with the screen orientation. [00:25:27] Speaker 04: Well, I get that. [00:25:29] Speaker 04: It just makes me uneasy when I see pages of argument that are based on a premise that, as you're pointing out now, doesn't exist. [00:25:40] Speaker 04: This statement is incorrect, isn't it? [00:25:43] Speaker 03: Not exactly incorrect. [00:25:43] Speaker 03: I think it's a different way of describing what that frame of reference is. [00:25:48] Speaker 03: Because, I mean, it'd be a different way. Well, it may be different, but you're using a term that just, you know, came out of, I don't know where it came from. Sure, I think it was meant really as a synonym for gravity. And gravity is relative to the earth, because it's a force from the center of the earth. Okay, that I understand. Yeah. But you're using a term that's just not [00:26:18] Speaker 03: in the record. [00:26:23] Speaker 03: But it's the exact same concept, because in the inertial frame, or gravity frame, you have three axes. [00:26:30] Speaker 03: One is the gravity, which is perpendicular to the surface of the Earth. [00:26:35] Speaker 03: Then you have the two other axes that give you the motion, basically. [00:26:40] Speaker 03: And so that earth frame is the same thing as the gravity frame, as what's being talked about in these patents here.
[00:26:48] Speaker 03: Now, at one point, if I might add, the claims nowhere talk about sensing a field or sensing gravity or anything like that. [00:26:59] Speaker 03: And so what the board did was they defined the frame with respect to what this device was doing, which is, OK, I need to align with the frame of the screen. [00:27:11] Speaker 03: And as you pointed out, all those drawings show the screen as being vertical. [00:27:18] Speaker 03: It essentially is the same as the gravity axis or the earth frame. [00:27:24] Speaker 04: Well, you have figure 3, and the screen in that figure is not vertical. [00:27:30] Speaker 03: Yes, but that's a different embodiment of Ide. [00:27:33] Speaker 03: And everyone agrees that that one is clearly tilted. [00:27:36] Speaker 03: It's depicted as tilted. [00:27:37] Speaker 03: But when you look at the other figures, I think it was 33 and 35 with the big boxy TVs, those are essentially vertical. [00:27:45] Speaker 03: One other point I'd like to make: the claim construction proposed by Hillcrest talks about this external field and gravity, but when you read the '118 patent carefully, yes, they talk about using accelerometers and the like, but they also talk about using different types of sensors that have nothing to do with fields like gravity. [00:28:09] Speaker 03: If you look at, for example, column 16, line 44, [00:28:14] Speaker 03: this is at page 47 in the appendix. [00:28:26] Speaker 03: Column 16, line 44. [00:28:29] Speaker 03: They list a number of types of sensors. [00:28:32] Speaker 03: Accelerometers are one of them. [00:28:34] Speaker 03: They also list cameras as being a type of sensor you could use to implement this invention. [00:28:42] Speaker 03: I don't think I know of any camera that senses gravity. [00:28:48] Speaker 04: Yeah, cameras that sense the angle of the horizon. [00:28:55] Speaker 03: But that's not sensing gravity.
[00:28:57] Speaker 03: And if you talk about a conventional camera, that's a visual image. [00:29:04] Speaker 03: I mean, you asked me, and I'm the wrong person to ask. [00:29:08] Speaker 03: I was just expressing my own... So was I. But let's not do that. [00:29:13] Speaker 03: The additional point I want to make is that throughout the prosecution, from the original prosecution to the re-exam, the terms inertial frame and user frame were essentially used interchangeably. [00:29:25] Speaker 03: Going back to the notice of allowability from the original examiner, they treated the two terms essentially the same. [00:29:31] Speaker 03: Hillcrest in its argumentation treats them the same. [00:29:34] Speaker 03: And the $64 question is, if these terms are so significant yet so different, why wouldn't Hillcrest have helped themselves out by giving us a clear definition and a clear demarcation in the patent? [00:29:48] Speaker 03: User frame is this, inertial frame is that. [00:29:52] Speaker 04: Okay, do you want to conclude? [00:29:53] Speaker 04: You're out of time. [00:29:54] Speaker 03: I'm out of time. [00:29:55] Speaker 03: I appreciate your indulgence. [00:29:57] Speaker 03: Thank you. [00:30:04] Speaker 02: Thank you, Your Honor. [00:30:06] Speaker 02: Just a few points. [00:30:09] Speaker 02: With respect to the Ide reference, if I could refer the court to appendix 746. [00:30:19] Speaker 02: This goes to the rotational question about whether or not the [00:30:27] Speaker 02: correction equation uses linear sensor outputs or rotational sensor outputs. [00:30:31] Speaker 02: And Your Honor, I do see. [00:30:33] Speaker 01: Is this what you're showing us? [00:30:34] Speaker 02: Yes, Your Honor. [00:30:37] Speaker 01: Well, I mean, this is the whole problem with this case. [00:30:39] Speaker 01: I mean, I don't know about my colleagues, but I don't have this kind of background to understand what this is going to say.
[00:30:46] Speaker 01: And so you're going to tell me one thing. [00:30:48] Speaker 01: He would have probably told me the other thing. [00:30:50] Speaker 01: I bet the board's saying one thing. [00:30:51] Speaker 01: How do I find a lack of substantial evidence for this? [00:30:55] Speaker 02: I appreciate what you're saying, Your Honor. [00:30:57] Speaker 01: Why don't you address the parts that he showed me about where the examiner talked about the rotational outputs and things like that in the board's decision. [00:31:06] Speaker 01: You can go ahead and do this. [00:31:08] Speaker 01: But like I said, if you want me to understand this chart, I'm either going to have to take your word for it or not. [00:31:16] Speaker 01: I don't think anything you're going to say about it is going to convince me of some objective truth about what this says. [00:31:21] Speaker 02: Well, let me just take a step back then and hopefully I'll address your question. [00:31:24] Speaker 04: Let me ask you this question. [00:31:26] Speaker 04: So I went and asked your opponent about the earth frame. [00:31:29] Speaker 04: He said, well, there's no earth frame involved, but it's a gravity frame. [00:31:33] Speaker 04: And he explained to me the gravitational forces of earth. Is the term gravity frame used anywhere? [00:31:40] Speaker 02: No, Your Honor. [00:31:41] Speaker 02: The only terms that are used in the patent are inertial frame of reference and user frame of reference. [00:31:45] Speaker 02: And Judge Hughes, you did [00:31:47] Speaker 02: correctly state the issue here, which is that in EDAY there is no way to sense the screen and therefore there is no way to automatically rotate into that screen's frame of reference.
[00:32:00] Speaker 02: The only way it can be done is to manually align the device to the screen, and then from that point forward... and there has to be some sort of button or way to tell the device, this is now the frame of reference I want you to rotate into. [00:32:13] Speaker 02: There are six embodiments in EDAY. [00:32:16] Speaker 02: The first three don't use rotation at all. [00:32:18] Speaker 02: There's no correction at all. [00:32:19] Speaker 02: And those are the cursor embodiments. [00:32:21] Speaker 02: And so the first three embodiments of EDAY have a cursor on a screen and they don't have any way to compensate for tilt. [00:32:28] Speaker 02: The tilt compensation comes in in the fourth embodiment, which is dealing with gesture recognition. [00:32:33] Speaker 02: What the examiner tried to do is to say, well, I'm going to take just that rotation correction circuit and rotation correction equation, and I'm going to graft that onto the cursor control embodiments, which is fine. [00:32:46] Speaker 02: But when you do that, you have to take the entire circuit and the entire equation. [00:32:50] Speaker 02: And what that requires is a setting to tell the device, here is the frame of reference that I want to use. [00:32:57] Speaker 02: And you put your finger on it exactly. [00:32:59] Speaker 02: It's the difference between automatic correction, [00:33:01] Speaker 02: which is what an inertial frame of reference will do for you, and a manual correction. [00:33:05] Speaker 02: That is the difference between a user frame of reference and an inertial frame of reference. [00:33:10] Speaker 01: I'm getting there a little bit more. [00:33:12] Speaker 01: You're being much more helpful than you were in the brief. [00:33:15] Speaker 01: What you're saying is your device, when you turn it on, because it has these rotational sensors, and not just a gyroscope or whatever it shows, it knows which way is down automatically. [00:33:28] Speaker 01: And the screen is down.
[00:33:31] Speaker 01: I assume we can pose hypotheticals about a screen being like this and this, but that's not what we're really talking about. [00:33:37] Speaker 01: People don't work with screens that are sideways or something like that. [00:33:40] Speaker 01: I mean, maybe there's some cell or something. [00:33:42] Speaker 01: So your device knows what is down. [00:33:45] Speaker 01: I'll use this as an example. [00:33:46] Speaker 01: Yeah, yeah. [00:33:46] Speaker 01: And so EDAY doesn't show what's down, except that... didn't the examiner find that it would be inherent to use an accelerometer with EDAY? [00:33:57] Speaker 02: Yeah, Your Honor. [00:33:58] Speaker 02: I did say that before. [00:34:01] Speaker 02: The examiner didn't find that it was inherent. [00:34:03] Speaker 02: The examiner found it was obvious to take a rotation sensor and replace it with an accelerometer for detecting relative rotation. [00:34:15] Speaker 02: Not for detecting tilt relative to gravity. [00:34:17] Speaker 02: It was a one-for-one replacement. [00:34:19] Speaker 02: So once you do that replacement, you're still detecting relative... But an accelerometer knows what down is. [00:34:26] Speaker 02: And the replacement that the board found would be obvious was not to replace it with an accelerometer that measures tilt relative to absolute vertical and then uses that in the rotation equation. [00:34:36] Speaker 02: That would be an additional step of logic and technology that the examiner did not find. [00:34:42] Speaker 02: And the board certainly didn't find that. [00:34:44] Speaker 02: And so I think you basically boiled it down correctly. [00:34:47] Speaker 02: It's the difference between automatic correction versus having to do it manually. [00:34:51] Speaker 02: That's the difference between the two frames of reference. [00:34:53] Speaker 02: And we would submit that the examiner did not use, the board did not use, the correct construction.
[00:34:59] Speaker 02: He basically equated those two. [00:35:02] Speaker 04: Okay. [00:35:02] Speaker 04: We got your argument. [00:35:03] Speaker 04: Thank you very much. [00:35:04] Speaker 02: Thank you.