Apr 16 18:13:55.741622 ip-10-0-131-203 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 18:13:55.741633 ip-10-0-131-203 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 18:13:55.741643 ip-10-0-131-203 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 18:13:55.741944 ip-10-0-131-203 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 18:14:05.980444 ip-10-0-131-203 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 18:14:05.980462 ip-10-0-131-203 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot d40e20146e1f4a16b3a78c249a74a86d --
Apr 16 18:16:38.365856 ip-10-0-131-203 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 18:16:38.849668 ip-10-0-131-203 kubenswrapper[2573]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:16:38.849668 ip-10-0-131-203 kubenswrapper[2573]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 18:16:38.849668 ip-10-0-131-203 kubenswrapper[2573]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:16:38.849668 ip-10-0-131-203 kubenswrapper[2573]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 18:16:38.849668 ip-10-0-131-203 kubenswrapper[2573]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:16:38.851694 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.851609 2573 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 18:16:38.854914 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.854896 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:16:38.854914 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.854912 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:16:38.854914 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.854917 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:16:38.855093 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.854921 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:16:38.855093 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.854925 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:16:38.855093 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.854929 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:16:38.855093 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.854933 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:16:38.855093 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.854937 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:16:38.855093 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.854941 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:16:38.855093 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.854945 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:16:38.855093 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.854949 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:16:38.855093 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.854953 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:16:38.855093 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.854957 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:16:38.855093 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.854960 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:16:38.855093 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.854964 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:16:38.855093 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.854968 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:16:38.855093 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.854973 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:16:38.855093 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.854977 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:16:38.855093 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.854987 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:16:38.855093 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.854992 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:16:38.855093 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.854996 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:16:38.855093 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855000 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:16:38.855093 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855004 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:16:38.855900 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855008 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:16:38.855900 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855012 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:16:38.855900 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855016 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:16:38.855900 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855020 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:16:38.855900 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855024 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:16:38.855900 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855028 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:16:38.855900 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855032 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:16:38.855900 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855037 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:16:38.855900 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855041 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:16:38.855900 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855045 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:16:38.855900 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855049 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:16:38.855900 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855053 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:16:38.855900 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855057 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:16:38.855900 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855066 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:16:38.855900 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855072 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:16:38.855900 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855077 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:16:38.855900 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855081 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:16:38.855900 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855085 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:16:38.855900 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855089 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:16:38.856584 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855094 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:16:38.856584 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855098 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:16:38.856584 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855102 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:16:38.856584 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855106 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:16:38.856584 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855110 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:16:38.856584 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855114 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:16:38.856584 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855119 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:16:38.856584 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855123 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:16:38.856584 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855128 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:16:38.856584 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855131 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:16:38.856584 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855135 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:16:38.856584 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855140 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:16:38.856584 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855144 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:16:38.856584 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855148 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:16:38.856584 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855152 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:16:38.856584 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855156 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:16:38.856584 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855160 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:16:38.856584 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855164 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:16:38.856584 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855169 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:16:38.856584 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855173 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:16:38.857067 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855177 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:16:38.857067 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855184 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:16:38.857067 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855188 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:16:38.857067 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855192 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:16:38.857067 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855196 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:16:38.857067 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855199 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:16:38.857067 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855204 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:16:38.857067 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855209 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:16:38.857067 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855213 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:16:38.857067 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855218 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:16:38.857067 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855222 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:16:38.857067 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855226 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:16:38.857067 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855232 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:16:38.857067 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855236 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:16:38.857067 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855240 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:16:38.857067 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855244 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:16:38.857067 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855248 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:16:38.857067 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855252 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:16:38.857067 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855256 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:16:38.857067 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855260 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:16:38.857536 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855267 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:16:38.857536 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855272 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:16:38.857536 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855277 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:16:38.857536 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855282 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:16:38.857536 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855887 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:16:38.857536 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855896 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:16:38.857536 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855902 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:16:38.857536 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855906 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:16:38.857536 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855910 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:16:38.857536 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855913 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:16:38.857536 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855917 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:16:38.857536 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855921 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:16:38.857536 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855924 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:16:38.857536 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855928 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:16:38.857536 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855931 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:16:38.857536 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855935 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:16:38.857536 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855939 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:16:38.857536 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855943 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:16:38.857536 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855949 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:16:38.858341 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855954 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:16:38.858341 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855958 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:16:38.858341 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855963 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:16:38.858341 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855967 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:16:38.858341 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855971 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:16:38.858341 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855976 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:16:38.858341 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855980 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:16:38.858341 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855985 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:16:38.858341 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855989 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:16:38.858341 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855994 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:16:38.858341 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.855998 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:16:38.858341 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856002 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:16:38.858341 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856006 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:16:38.858341 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856010 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:16:38.858341 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856014 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:16:38.858341 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856017 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:16:38.858341 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856021 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:16:38.858341 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856026 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:16:38.858341 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856030 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:16:38.858341 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856034 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:16:38.859122 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856038 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:16:38.859122 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856042 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:16:38.859122 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856046 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:16:38.859122 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856050 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:16:38.859122 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856054 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:16:38.859122 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856058 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:16:38.859122 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856062 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:16:38.859122 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856066 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:16:38.859122 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856070 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:16:38.859122 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856075 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:16:38.859122 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856079 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:16:38.859122 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856084 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:16:38.859122 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856090 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:16:38.859122 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856096 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:16:38.859122 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856101 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:16:38.859122 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856106 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:16:38.859122 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856110 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:16:38.859122 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856114 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:16:38.859122 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856121 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:16:38.859122 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856125 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:16:38.859622 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856130 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:16:38.859622 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856135 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:16:38.859622 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856143 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:16:38.859622 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856148 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:16:38.859622 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856153 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:16:38.859622 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856157 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:16:38.859622 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856161 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:16:38.859622 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856165 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:16:38.859622 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856169 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:16:38.859622 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856173 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:16:38.859622 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856177 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:16:38.859622 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856181 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:16:38.859622 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856185 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:16:38.859622 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856189 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:16:38.859622 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856193 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:16:38.859622 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856197 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:16:38.859622 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856201 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:16:38.859622 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856206 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:16:38.859622 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856209 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:16:38.860076 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856214 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:16:38.860076 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856218 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:16:38.860076 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856222 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:16:38.860076 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856226 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:16:38.860076 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856230 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:16:38.860076 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856234 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:16:38.860076 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856240 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:16:38.860076 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856244 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:16:38.860076 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856248 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:16:38.860076 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856253 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:16:38.860076 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856257 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:16:38.860076 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.856260 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:16:38.860076 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857628 2573 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 18:16:38.860076 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857644 2573 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 18:16:38.860076 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857653 2573 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 18:16:38.860076 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857660 2573 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 18:16:38.860076 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857667 2573 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 18:16:38.860076 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857673 2573 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 18:16:38.860076 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857681 2573 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 18:16:38.860076 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857688 2573 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 18:16:38.860076 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857693 2573 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 18:16:38.860597 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857698 2573 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 18:16:38.860597 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857703 2573 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 18:16:38.860597 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857708 2573 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 18:16:38.860597 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857713 2573 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 18:16:38.860597 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857718 2573 flags.go:64] FLAG: --cgroup-root=""
Apr 16 18:16:38.860597 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857723 2573 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 18:16:38.860597 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857728 2573 flags.go:64] FLAG: --client-ca-file=""
Apr 16 18:16:38.860597 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857733 2573 flags.go:64] FLAG: --cloud-config=""
Apr 16 18:16:38.860597 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857738 2573 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 18:16:38.860597 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857743 2573 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 18:16:38.860597 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857750 2573 flags.go:64] FLAG: --cluster-domain=""
Apr 16 18:16:38.860597 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857755 2573 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 18:16:38.860597 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857760 2573 flags.go:64] FLAG: --config-dir=""
Apr 16 18:16:38.860597 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857764 2573 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 18:16:38.860597 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857769 2573 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 18:16:38.860597 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857776 2573 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 18:16:38.860597 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857780 2573 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 18:16:38.860597 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857786 2573 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 18:16:38.860597 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857791 2573 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 18:16:38.860597 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857796 2573 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 18:16:38.860597 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857800 2573 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 18:16:38.860597 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857805 2573 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 18:16:38.860597 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857810 2573 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 18:16:38.860597 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857815 2573 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 18:16:38.860597 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857823 2573 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 18:16:38.861252 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857829 2573 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 18:16:38.861252 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857834 2573 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 18:16:38.861252 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857839 2573 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 18:16:38.861252 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857844 2573 flags.go:64] FLAG: --enable-server="true"
Apr 16 18:16:38.861252 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857848 2573 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 18:16:38.861252 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857854 2573 flags.go:64] FLAG: --event-burst="100"
Apr 16 18:16:38.861252 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857859 2573 flags.go:64] FLAG: --event-qps="50"
Apr 16 18:16:38.861252 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857864 2573 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 18:16:38.861252 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857869 2573 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 18:16:38.861252 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857874 2573 flags.go:64] FLAG: --eviction-hard=""
Apr 16 18:16:38.861252 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857880 2573 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 18:16:38.861252 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857885 2573 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 18:16:38.861252 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857889 2573 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 18:16:38.861252 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857894 2573 flags.go:64] FLAG: --eviction-soft=""
Apr 16 18:16:38.861252 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857899 2573 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 18:16:38.861252 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857903 2573 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 18:16:38.861252 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857908 2573 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 18:16:38.861252 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857913 2573 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 18:16:38.861252 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857918 2573 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 18:16:38.861252 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857922 2573 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 18:16:38.861252 ip-10-0-131-203
kubenswrapper[2573]: I0416 18:16:38.857927 2573 flags.go:64] FLAG: --feature-gates="" Apr 16 18:16:38.861252 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857933 2573 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 18:16:38.861252 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857938 2573 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 18:16:38.861252 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857943 2573 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 18:16:38.861252 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857948 2573 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 18:16:38.862087 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857954 2573 flags.go:64] FLAG: --healthz-port="10248" Apr 16 18:16:38.862087 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857959 2573 flags.go:64] FLAG: --help="false" Apr 16 18:16:38.862087 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857963 2573 flags.go:64] FLAG: --hostname-override="ip-10-0-131-203.ec2.internal" Apr 16 18:16:38.862087 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857968 2573 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 18:16:38.862087 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857972 2573 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 18:16:38.862087 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857977 2573 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 18:16:38.862087 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857983 2573 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 18:16:38.862087 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857989 2573 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 18:16:38.862087 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.857995 2573 flags.go:64] FLAG: 
--image-gc-low-threshold="80" Apr 16 18:16:38.862087 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858001 2573 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 18:16:38.862087 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858006 2573 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 18:16:38.862087 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858011 2573 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 18:16:38.862087 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858015 2573 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 18:16:38.862087 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858020 2573 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 18:16:38.862087 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858025 2573 flags.go:64] FLAG: --kube-reserved="" Apr 16 18:16:38.862087 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858030 2573 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 18:16:38.862087 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858034 2573 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 18:16:38.862087 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858039 2573 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 18:16:38.862087 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858044 2573 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 18:16:38.862087 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858048 2573 flags.go:64] FLAG: --lock-file="" Apr 16 18:16:38.862087 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858053 2573 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 18:16:38.862087 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858057 2573 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 18:16:38.862087 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858062 2573 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 18:16:38.862087 ip-10-0-131-203 kubenswrapper[2573]: 
I0416 18:16:38.858071 2573 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 18:16:38.862669 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858076 2573 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 18:16:38.862669 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858080 2573 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 18:16:38.862669 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858085 2573 flags.go:64] FLAG: --logging-format="text" Apr 16 18:16:38.862669 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858089 2573 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 18:16:38.862669 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858094 2573 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 18:16:38.862669 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858103 2573 flags.go:64] FLAG: --manifest-url="" Apr 16 18:16:38.862669 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858108 2573 flags.go:64] FLAG: --manifest-url-header="" Apr 16 18:16:38.862669 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858115 2573 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 18:16:38.862669 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858120 2573 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 18:16:38.862669 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858126 2573 flags.go:64] FLAG: --max-pods="110" Apr 16 18:16:38.862669 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858130 2573 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 18:16:38.862669 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858135 2573 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 18:16:38.862669 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858139 2573 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 18:16:38.862669 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858144 2573 flags.go:64] FLAG: 
--minimum-container-ttl-duration="6m0s" Apr 16 18:16:38.862669 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858149 2573 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 18:16:38.862669 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858154 2573 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 18:16:38.862669 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858160 2573 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 18:16:38.862669 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858171 2573 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 18:16:38.862669 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858176 2573 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 18:16:38.862669 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858181 2573 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 18:16:38.862669 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858186 2573 flags.go:64] FLAG: --pod-cidr="" Apr 16 18:16:38.862669 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858190 2573 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 18:16:38.862669 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858199 2573 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 18:16:38.863300 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858204 2573 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 18:16:38.863300 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858209 2573 flags.go:64] FLAG: --pods-per-core="0" Apr 16 18:16:38.863300 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858213 2573 flags.go:64] FLAG: --port="10250" Apr 16 18:16:38.863300 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858218 2573 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 18:16:38.863300 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858223 
2573 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0f866a8c09e6079da" Apr 16 18:16:38.863300 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858228 2573 flags.go:64] FLAG: --qos-reserved="" Apr 16 18:16:38.863300 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858233 2573 flags.go:64] FLAG: --read-only-port="10255" Apr 16 18:16:38.863300 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858238 2573 flags.go:64] FLAG: --register-node="true" Apr 16 18:16:38.863300 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858242 2573 flags.go:64] FLAG: --register-schedulable="true" Apr 16 18:16:38.863300 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858247 2573 flags.go:64] FLAG: --register-with-taints="" Apr 16 18:16:38.863300 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858253 2573 flags.go:64] FLAG: --registry-burst="10" Apr 16 18:16:38.863300 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858258 2573 flags.go:64] FLAG: --registry-qps="5" Apr 16 18:16:38.863300 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858262 2573 flags.go:64] FLAG: --reserved-cpus="" Apr 16 18:16:38.863300 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858267 2573 flags.go:64] FLAG: --reserved-memory="" Apr 16 18:16:38.863300 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858277 2573 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 18:16:38.863300 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858284 2573 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 18:16:38.863300 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858289 2573 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 18:16:38.863300 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858293 2573 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 18:16:38.863300 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858298 2573 flags.go:64] FLAG: --runonce="false" Apr 16 18:16:38.863300 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858302 2573 
flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 18:16:38.863300 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858307 2573 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 18:16:38.863300 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858312 2573 flags.go:64] FLAG: --seccomp-default="false" Apr 16 18:16:38.863300 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858316 2573 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 18:16:38.863300 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858321 2573 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 18:16:38.863300 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858326 2573 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 18:16:38.863300 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858331 2573 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 18:16:38.863957 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858338 2573 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 18:16:38.863957 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858343 2573 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 18:16:38.863957 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858347 2573 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 18:16:38.863957 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858351 2573 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 18:16:38.863957 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858356 2573 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 18:16:38.863957 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858361 2573 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 18:16:38.863957 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858365 2573 flags.go:64] FLAG: --system-cgroups="" Apr 16 18:16:38.863957 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858371 2573 flags.go:64] FLAG: 
--system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 18:16:38.863957 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858380 2573 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 18:16:38.863957 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858384 2573 flags.go:64] FLAG: --tls-cert-file="" Apr 16 18:16:38.863957 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858389 2573 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 18:16:38.863957 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858396 2573 flags.go:64] FLAG: --tls-min-version="" Apr 16 18:16:38.863957 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858400 2573 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 18:16:38.863957 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858404 2573 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 18:16:38.863957 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858410 2573 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 18:16:38.863957 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858415 2573 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 18:16:38.863957 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858419 2573 flags.go:64] FLAG: --v="2" Apr 16 18:16:38.863957 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858426 2573 flags.go:64] FLAG: --version="false" Apr 16 18:16:38.863957 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858432 2573 flags.go:64] FLAG: --vmodule="" Apr 16 18:16:38.863957 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858438 2573 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 18:16:38.863957 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.858444 2573 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 18:16:38.863957 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858610 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 
18:16:38.863957 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858618 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:16:38.863957 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858624 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:16:38.864520 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858629 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 18:16:38.864520 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858635 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 18:16:38.864520 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858640 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:16:38.864520 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858645 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:16:38.864520 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858649 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:16:38.864520 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858653 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 18:16:38.864520 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858658 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:16:38.864520 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858663 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:16:38.864520 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858667 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:16:38.864520 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858672 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:16:38.864520 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858676 2573 feature_gate.go:328] 
unrecognized feature gate: ImageStreamImportMode Apr 16 18:16:38.864520 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858680 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:16:38.864520 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858685 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:16:38.864520 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858689 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:16:38.864520 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858693 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:16:38.864520 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858697 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:16:38.864520 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858701 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:16:38.864520 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858706 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:16:38.864520 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858710 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:16:38.864520 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858714 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:16:38.865029 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858718 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:16:38.865029 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858722 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:16:38.865029 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858726 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:16:38.865029 ip-10-0-131-203 
kubenswrapper[2573]: W0416 18:16:38.858730 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:16:38.865029 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858734 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:16:38.865029 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858737 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:16:38.865029 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858742 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:16:38.865029 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858746 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:16:38.865029 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858750 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:16:38.865029 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858756 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:16:38.865029 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858760 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:16:38.865029 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858764 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:16:38.865029 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858768 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:16:38.865029 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858773 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:16:38.865029 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858777 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:16:38.865029 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858780 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 
16 18:16:38.865029 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858785 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:16:38.865029 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858791 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:16:38.865029 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858795 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:16:38.865029 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858799 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:16:38.865507 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858803 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 18:16:38.865507 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858807 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:16:38.865507 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858812 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:16:38.865507 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858816 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:16:38.865507 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858821 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:16:38.865507 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858825 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:16:38.865507 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858829 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:16:38.865507 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858833 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:16:38.865507 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858837 2573 feature_gate.go:328] 
unrecognized feature gate: SigstoreImageVerification Apr 16 18:16:38.865507 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858841 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:16:38.865507 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858845 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:16:38.865507 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858849 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 18:16:38.865507 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858853 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:16:38.865507 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858858 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:16:38.865507 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858862 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:16:38.865507 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858866 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:16:38.865507 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858870 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:16:38.865507 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858873 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:16:38.865507 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858877 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:16:38.866050 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858881 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:16:38.866050 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858885 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 
18:16:38.866050 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858891 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:16:38.866050 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858895 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:16:38.866050 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858899 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:16:38.866050 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858903 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:16:38.866050 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858906 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:16:38.866050 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858911 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:16:38.866050 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858915 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:16:38.866050 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858919 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:16:38.866050 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858926 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:16:38.866050 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858930 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:16:38.866050 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858934 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:16:38.866050 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858938 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:16:38.866050 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858945 2573 feature_gate.go:351] Setting GA feature gate 
ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 18:16:38.866050 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858950 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:16:38.866050 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858956 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:16:38.866050 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858960 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:16:38.866050 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858965 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:16:38.866504 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858969 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:16:38.866504 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858974 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:16:38.866504 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858978 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:16:38.866504 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858982 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:16:38.866504 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.858986 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:16:38.866504 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.860569 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true 
TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 18:16:38.868304 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.868284 2573 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 18:16:38.868344 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.868305 2573 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 18:16:38.868371 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868363 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:16:38.868371 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868368 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:16:38.868422 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868372 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:16:38.868422 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868376 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:16:38.868422 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868378 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:16:38.868422 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868381 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:16:38.868422 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868384 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:16:38.868422 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868386 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:16:38.868422 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868390 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:16:38.868422 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868392 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 
18:16:38.868422 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868395 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:16:38.868422 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868397 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:16:38.868422 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868400 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:16:38.868422 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868402 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:16:38.868422 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868405 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:16:38.868422 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868407 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:16:38.868422 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868410 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:16:38.868422 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868412 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:16:38.868422 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868415 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:16:38.868422 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868419 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 18:16:38.868422 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868421 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:16:38.868422 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868424 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:16:38.868937 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868426 2573 
feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:16:38.868937 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868429 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:16:38.868937 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868432 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:16:38.868937 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868435 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:16:38.868937 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868438 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:16:38.868937 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868440 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 18:16:38.868937 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868443 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:16:38.868937 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868445 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:16:38.868937 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868448 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:16:38.868937 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868455 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:16:38.868937 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868458 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:16:38.868937 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868460 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:16:38.868937 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868463 2573 feature_gate.go:328] unrecognized feature gate: 
HighlyAvailableArbiter Apr 16 18:16:38.868937 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868465 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:16:38.868937 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868469 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 18:16:38.868937 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868473 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:16:38.868937 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868476 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:16:38.868937 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868479 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:16:38.868937 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868482 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:16:38.869387 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868485 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:16:38.869387 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868487 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:16:38.869387 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868490 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:16:38.869387 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868493 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:16:38.869387 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868496 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:16:38.869387 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868498 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:16:38.869387 
ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868501 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:16:38.869387 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868503 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:16:38.869387 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868506 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:16:38.869387 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868508 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:16:38.869387 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868511 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:16:38.869387 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868513 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:16:38.869387 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868516 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:16:38.869387 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868518 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:16:38.869387 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868521 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:16:38.869387 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868523 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 18:16:38.869387 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868526 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:16:38.869387 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868528 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:16:38.869387 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868531 2573 feature_gate.go:328] unrecognized feature gate: 
AlibabaPlatform Apr 16 18:16:38.869387 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868534 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:16:38.869894 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868536 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:16:38.869894 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868538 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:16:38.869894 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868554 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:16:38.869894 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868558 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:16:38.869894 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868560 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:16:38.869894 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868563 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:16:38.869894 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868566 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:16:38.869894 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868568 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:16:38.869894 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868571 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:16:38.869894 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868574 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 18:16:38.869894 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868578 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 18:16:38.869894 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868581 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:16:38.869894 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868583 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:16:38.869894 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868586 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:16:38.869894 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868588 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:16:38.869894 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868591 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:16:38.869894 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868594 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:16:38.869894 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868596 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:16:38.869894 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868599 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:16:38.870334 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868601 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:16:38.870334 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868604 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:16:38.870334 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868606 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:16:38.870334 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868609 2573 feature_gate.go:328] unrecognized feature 
gate: OVNObservability Apr 16 18:16:38.870334 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868611 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:16:38.870334 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868614 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:16:38.870334 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.868619 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 18:16:38.870334 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868734 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:16:38.870334 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868740 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:16:38.870334 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868743 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:16:38.870334 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868747 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:16:38.870334 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868749 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:16:38.870334 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868752 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:16:38.870334 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868755 2573 feature_gate.go:328] unrecognized feature gate: 
AdminNetworkPolicy Apr 16 18:16:38.870334 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868757 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:16:38.870334 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868760 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:16:38.870739 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868762 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:16:38.870739 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868765 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:16:38.870739 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868768 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:16:38.870739 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868770 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:16:38.870739 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868773 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:16:38.870739 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868775 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:16:38.870739 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868777 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:16:38.870739 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868780 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:16:38.870739 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868783 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 18:16:38.870739 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868788 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:16:38.870739 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868791 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:16:38.870739 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868793 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:16:38.870739 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868797 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:16:38.870739 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868799 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:16:38.870739 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868802 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:16:38.870739 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868804 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:16:38.870739 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868806 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:16:38.870739 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868809 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:16:38.870739 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868811 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:16:38.870739 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868814 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:16:38.871201 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868816 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 18:16:38.871201 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868819 2573 feature_gate.go:328] 
unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:16:38.871201 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868821 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:16:38.871201 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868824 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:16:38.871201 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868827 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:16:38.871201 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868829 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:16:38.871201 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868832 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:16:38.871201 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868834 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:16:38.871201 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868837 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:16:38.871201 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868839 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:16:38.871201 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868843 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 18:16:38.871201 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868846 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 18:16:38.871201 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868849 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:16:38.871201 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868852 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 18:16:38.871201 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868861 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:16:38.871201 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868863 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:16:38.871201 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868866 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:16:38.871201 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868868 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:16:38.871201 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868871 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:16:38.871739 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868874 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:16:38.871739 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868876 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:16:38.871739 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868879 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:16:38.871739 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868881 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:16:38.871739 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868883 2573 
feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:16:38.871739 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868886 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:16:38.871739 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868889 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:16:38.871739 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868891 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:16:38.871739 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868894 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:16:38.871739 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868896 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:16:38.871739 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868899 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:16:38.871739 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868901 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:16:38.871739 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868904 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:16:38.871739 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868906 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:16:38.871739 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868908 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:16:38.871739 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868911 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:16:38.871739 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868913 2573 feature_gate.go:328] unrecognized feature gate: 
BootcNodeManagement Apr 16 18:16:38.871739 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868916 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:16:38.871739 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868918 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:16:38.872177 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868921 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:16:38.872177 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868923 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:16:38.872177 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868926 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:16:38.872177 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868928 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:16:38.872177 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868931 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:16:38.872177 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868933 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:16:38.872177 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868936 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 18:16:38.872177 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868938 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:16:38.872177 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868941 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:16:38.872177 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868949 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:16:38.872177 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868951 2573 
feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:16:38.872177 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868954 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:16:38.872177 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868956 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:16:38.872177 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868959 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:16:38.872177 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868961 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:16:38.872177 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868963 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:16:38.872177 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868966 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:16:38.872177 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868968 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:16:38.872177 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:38.868971 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:16:38.872636 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.868976 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 18:16:38.872636 ip-10-0-131-203 
kubenswrapper[2573]: I0416 18:16:38.869879 2573 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 18:16:38.872636 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.872118 2573 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 18:16:38.873337 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.873325 2573 server.go:1019] "Starting client certificate rotation" Apr 16 18:16:38.873441 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.873424 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 18:16:38.873473 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.873464 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 18:16:38.900917 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.900902 2573 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 18:16:38.903590 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.903535 2573 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 18:16:38.925897 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.925879 2573 log.go:25] "Validated CRI v1 runtime API" Apr 16 18:16:38.931814 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.931800 2573 log.go:25] "Validated CRI v1 image API" Apr 16 18:16:38.933135 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.933119 2573 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 18:16:38.934054 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.934037 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 
18:16:38.937985 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.937968 2573 fs.go:135] Filesystem UUIDs: map[5400fe31-6882-404c-b577-b0099154727d:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 e4572613-2401-4e4e-9a52-1fca238bf0cd:/dev/nvme0n1p3] Apr 16 18:16:38.938048 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.937986 2573 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 18:16:38.943574 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.943457 2573 manager.go:217] Machine: {Timestamp:2026-04-16 18:16:38.941526637 +0000 UTC m=+0.448791744 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3195872 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec20619c1f87d508527f848bcacf7dbb SystemUUID:ec20619c-1f87-d508-527f-848bcacf7dbb BootID:d40e2014-6e1f-4a16-b3a7-8c249a74a86d Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 
DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:26:67:70:4a:77 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:26:67:70:4a:77 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:36:0c:37:78:ac:93 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 18:16:38.943574 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.943568 2573 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 16 18:16:38.943679 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.943665 2573 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 18:16:38.944662 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.944639 2573 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 18:16:38.944806 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.944665 2573 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-131-203.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 18:16:38.944847 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.944814 2573 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 18:16:38.944847 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.944822 2573 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 18:16:38.944847 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.944839 2573 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 18:16:38.944920 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.944854 2573 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 18:16:38.947369 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.947359 2573 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 18:16:38.947626 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.947617 2573 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 18:16:38.950126 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.950116 2573 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 18:16:38.950157 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.950135 2573 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 18:16:38.950157 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.950146 2573 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 18:16:38.950157 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.950155 2573 kubelet.go:397] "Adding apiserver pod source"
Apr 16 18:16:38.950233 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.950163 2573 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 18:16:38.951231 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.951220 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 18:16:38.951275 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.951237 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 18:16:38.952132 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.952105 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9k8kr"
Apr 16 18:16:38.954464 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.954447 2573 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 18:16:38.956325 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.956310 2573 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 18:16:38.957944 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.957928 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 18:16:38.957944 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.957947 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 18:16:38.958044 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.957957 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 18:16:38.958044 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.957968 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 18:16:38.958044 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.957975 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 18:16:38.958044 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.957983 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 18:16:38.958044 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.957991 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 18:16:38.958044 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.957998 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 18:16:38.958044 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.958007 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 18:16:38.958044 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.958013 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 18:16:38.958044 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.958033 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 18:16:38.958044 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.958041 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 18:16:38.958286 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.958104 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9k8kr"
Apr 16 18:16:38.960096 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.960081 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 18:16:38.960096 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.960097 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 18:16:38.960724 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:38.960691 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 18:16:38.960768 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:38.960736 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-203.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 18:16:38.963419 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.963404 2573 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 18:16:38.963492 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.963437 2573 server.go:1295] "Started kubelet"
Apr 16 18:16:38.963590 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.963564 2573 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 18:16:38.963625 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.963566 2573 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 18:16:38.963672 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.963656 2573 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 18:16:38.964415 ip-10-0-131-203 systemd[1]: Started Kubernetes Kubelet.
Apr 16 18:16:38.967613 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.967560 2573 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 18:16:38.968328 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.968309 2573 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 18:16:38.973084 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.972947 2573 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-131-203.ec2.internal" not found
Apr 16 18:16:38.973706 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:38.973685 2573 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 18:16:38.974109 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.974091 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 18:16:38.974697 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.974685 2573 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 18:16:38.975469 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.975451 2573 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 18:16:38.976317 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:38.976293 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-203.ec2.internal\" not found"
Apr 16 18:16:38.976386 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.976330 2573 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 18:16:38.976386 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.976345 2573 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 18:16:38.976470 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.976432 2573 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 18:16:38.976470 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.976443 2573 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 18:16:38.976608 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.976595 2573 factory.go:55] Registering systemd factory
Apr 16 18:16:38.976665 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.976612 2573 factory.go:223] Registration of the systemd container factory successfully
Apr 16 18:16:38.976963 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.976949 2573 factory.go:153] Registering CRI-O factory
Apr 16 18:16:38.977107 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.976967 2573 factory.go:223] Registration of the crio container factory successfully
Apr 16 18:16:38.977107 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.977076 2573 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 18:16:38.977107 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.977101 2573 factory.go:103] Registering Raw factory
Apr 16 18:16:38.977244 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.977118 2573 manager.go:1196] Started watching for new ooms in manager
Apr 16 18:16:38.977325 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.977077 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:16:38.978137 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.978121 2573 manager.go:319] Starting recovery of all containers
Apr 16 18:16:38.980590 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:38.980567 2573 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-131-203.ec2.internal\" not found" node="ip-10-0-131-203.ec2.internal"
Apr 16 18:16:38.986899 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.986888 2573 manager.go:324] Recovery completed
Apr 16 18:16:38.990648 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.990636 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:16:38.992849 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.992832 2573 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-131-203.ec2.internal" not found
Apr 16 18:16:38.993131 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.993113 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-203.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:16:38.993207 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.993144 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-203.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:16:38.993207 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.993154 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-203.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:16:38.993700 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.993688 2573 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 18:16:38.993700 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.993698 2573 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 18:16:38.993784 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.993715 2573 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 18:16:38.996868 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.996857 2573 policy_none.go:49] "None policy: Start"
Apr 16 18:16:38.996948 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.996872 2573 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 18:16:38.996948 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:38.996881 2573 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 18:16:39.046686 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.046070 2573 manager.go:341] "Starting Device Plugin manager"
Apr 16 18:16:39.046686 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:39.046100 2573 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 18:16:39.046686 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.046113 2573 server.go:85] "Starting device plugin registration server"
Apr 16 18:16:39.046686 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.046325 2573 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 18:16:39.046686 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.046335 2573 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 18:16:39.046686 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.046425 2573 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 18:16:39.046686 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.046512 2573 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 18:16:39.046686 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.046521 2573 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 18:16:39.047041 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:39.047013 2573 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 18:16:39.047075 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:39.047044 2573 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-203.ec2.internal\" not found"
Apr 16 18:16:39.053185 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.053172 2573 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-131-203.ec2.internal" not found
Apr 16 18:16:39.115310 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.115252 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 18:16:39.116367 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.116354 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 18:16:39.116459 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.116375 2573 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 18:16:39.116459 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.116393 2573 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 18:16:39.116459 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.116402 2573 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 18:16:39.116459 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:39.116430 2573 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 18:16:39.119712 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.119698 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:16:39.146809 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.146798 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:16:39.147573 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.147560 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-203.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:16:39.147634 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.147584 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-203.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:16:39.147634 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.147593 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-203.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:16:39.147634 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.147613 2573 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-203.ec2.internal"
Apr 16 18:16:39.157952 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.157930 2573 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-203.ec2.internal"
Apr 16 18:16:39.158032 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:39.157962 2573 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-203.ec2.internal\": node \"ip-10-0-131-203.ec2.internal\" not found"
Apr 16 18:16:39.182571 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:39.182538 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-203.ec2.internal\" not found"
Apr 16 18:16:39.217073 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.217029 2573 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-203.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-131-203.ec2.internal"]
Apr 16 18:16:39.217122 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.217110 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:16:39.217884 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.217870 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-203.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:16:39.217936 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.217898 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-203.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:16:39.217936 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.217907 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-203.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:16:39.219163 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.219152 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:16:39.219292 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.219278 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-203.ec2.internal"
Apr 16 18:16:39.219338 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.219304 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:16:39.219810 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.219798 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-203.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:16:39.219882 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.219811 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-203.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:16:39.219882 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.219836 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-203.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:16:39.219882 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.219849 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-203.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:16:39.220010 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.219817 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-203.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:16:39.220010 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.219895 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-203.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:16:39.221616 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.221592 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-203.ec2.internal"
Apr 16 18:16:39.221697 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.221619 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:16:39.222260 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.222239 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-203.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:16:39.222311 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.222271 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-203.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:16:39.222311 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.222284 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-203.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:16:39.243076 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:39.243058 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-203.ec2.internal\" not found" node="ip-10-0-131-203.ec2.internal"
Apr 16 18:16:39.247124 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:39.247110 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-203.ec2.internal\" not found" node="ip-10-0-131-203.ec2.internal"
Apr 16 18:16:39.279414 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.279395 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0e77e2e37a581a4f0ed7090fbdb2ba3d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-203.ec2.internal\" (UID: \"0e77e2e37a581a4f0ed7090fbdb2ba3d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-203.ec2.internal"
Apr 16 18:16:39.279475 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.279420 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0e77e2e37a581a4f0ed7090fbdb2ba3d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-203.ec2.internal\" (UID: \"0e77e2e37a581a4f0ed7090fbdb2ba3d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-203.ec2.internal"
Apr 16 18:16:39.279475 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.279439 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/cad5fa07f782ca2be3a004d69f182f5f-config\") pod \"kube-apiserver-proxy-ip-10-0-131-203.ec2.internal\" (UID: \"cad5fa07f782ca2be3a004d69f182f5f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-203.ec2.internal"
Apr 16 18:16:39.283086 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:39.283074 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-203.ec2.internal\" not found"
Apr 16 18:16:39.379927 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.379872 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0e77e2e37a581a4f0ed7090fbdb2ba3d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-203.ec2.internal\" (UID: \"0e77e2e37a581a4f0ed7090fbdb2ba3d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-203.ec2.internal"
Apr 16 18:16:39.379927 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.379909 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0e77e2e37a581a4f0ed7090fbdb2ba3d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-203.ec2.internal\" (UID: \"0e77e2e37a581a4f0ed7090fbdb2ba3d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-203.ec2.internal"
Apr 16 18:16:39.379927 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.379925 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/cad5fa07f782ca2be3a004d69f182f5f-config\") pod \"kube-apiserver-proxy-ip-10-0-131-203.ec2.internal\" (UID: \"cad5fa07f782ca2be3a004d69f182f5f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-203.ec2.internal"
Apr 16 18:16:39.380075 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.379995 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0e77e2e37a581a4f0ed7090fbdb2ba3d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-203.ec2.internal\" (UID: \"0e77e2e37a581a4f0ed7090fbdb2ba3d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-203.ec2.internal"
Apr 16 18:16:39.380075 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.380026 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/cad5fa07f782ca2be3a004d69f182f5f-config\") pod \"kube-apiserver-proxy-ip-10-0-131-203.ec2.internal\" (UID: \"cad5fa07f782ca2be3a004d69f182f5f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-203.ec2.internal"
Apr 16 18:16:39.380075 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.380056 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0e77e2e37a581a4f0ed7090fbdb2ba3d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-203.ec2.internal\" (UID: \"0e77e2e37a581a4f0ed7090fbdb2ba3d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-203.ec2.internal"
Apr 16 18:16:39.383997 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:39.383978 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-203.ec2.internal\" not found"
Apr 16 18:16:39.484382 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:39.484357 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-203.ec2.internal\" not found"
Apr 16 18:16:39.546494 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.546467 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-203.ec2.internal"
Apr 16 18:16:39.550212 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.550194 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-203.ec2.internal"
Apr 16 18:16:39.585393 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:39.585371 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-203.ec2.internal\" not found"
Apr 16 18:16:39.685890 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:39.685828 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-203.ec2.internal\" not found"
Apr 16 18:16:39.786381 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:39.786356 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-203.ec2.internal\" not found"
Apr 16 18:16:39.796100 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.796081 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:16:39.874007 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.873980 2573 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 18:16:39.874661 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.874108 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:16:39.874661 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.874153 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:16:39.874661 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.874155 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:16:39.875079 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.875061 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-203.ec2.internal"
Apr 16 18:16:39.895306 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.895285 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 18:16:39.897065 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.897053 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-203.ec2.internal"
Apr 16 18:16:39.904112 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.904098 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 18:16:39.951270 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.951251 2573 apiserver.go:52] "Watching apiserver"
Apr 16 18:16:39.959075 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.959054 2573 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 18:16:39.959423 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.959402 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-kq8d4","kube-system/kube-apiserver-proxy-ip-10-0-131-203.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6bmgn","openshift-image-registry/node-ca-d89lv","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-203.ec2.internal","openshift-multus/network-metrics-daemon-sw2bl","openshift-network-diagnostics/network-check-target-jgrxb","openshift-network-operator/iptables-alerter-t6c2c","openshift-ovn-kubernetes/ovnkube-node-tj7xv","kube-system/konnectivity-agent-c6ncp","openshift-cluster-node-tuning-operator/tuned-bsv9k","openshift-dns/node-resolver-ll55v","openshift-multus/multus-additional-cni-plugins-lp5l8"]
Apr 16 18:16:39.960347 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.960319 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 18:11:38 +0000 UTC" deadline="2027-12-10 08:15:02.658556271 +0000 UTC"
Apr 16 18:16:39.960347 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.960344 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14461h58m22.698214863s"
Apr 16 18:16:39.962288 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.961238 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-kq8d4"
Apr 16 18:16:39.963349 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.963322 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6bmgn"
Apr 16 18:16:39.963844 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.963823 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 18:16:39.963944 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.963857 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 18:16:39.964019 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.963996 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-qfbmz\""
Apr 16 18:16:39.964122 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.964109 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 18:16:39.964223 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.964209 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 18:16:39.964676 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.964656 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-d89lv" Apr 16 18:16:39.965427 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.965411 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-zdnjn\"" Apr 16 18:16:39.965499 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.965475 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 18:16:39.965656 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.965639 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 18:16:39.965715 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.965675 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 18:16:39.966674 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.966658 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 18:16:39.966857 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.966843 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 18:16:39.966932 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.966869 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 18:16:39.967094 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.967042 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sw2bl"
Apr 16 18:16:39.967094 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.967082 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-w7mkd\""
Apr 16 18:16:39.967094 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.967093 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jgrxb"
Apr 16 18:16:39.967250 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:39.967136 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jgrxb" podUID="2571d812-7882-455b-be2f-4e3888df0e6a"
Apr 16 18:16:39.967250 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:39.967132 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sw2bl" podUID="a7930c14-4ef0-4949-a2ae-9a240da66c3c"
Apr 16 18:16:39.968304 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.968290 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-t6c2c"
Apr 16 18:16:39.971229 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.971210 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:16:39.971351 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.971333 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 18:16:39.971404 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.971373 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 18:16:39.971511 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.971271 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-hnjqb\""
Apr 16 18:16:39.973151 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.973134 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:16:39.973371 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.973354 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-c6ncp"
Apr 16 18:16:39.974461 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.974445 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 18:16:39.974560 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.974475 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bsv9k"
Apr 16 18:16:39.975069 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.975052 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 18:16:39.975367 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.975351 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-hvk9x\""
Apr 16 18:16:39.975485 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.975464 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 18:16:39.975485 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.975478 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 18:16:39.975852 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.975626 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 18:16:39.975852 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.975814 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-ll55v"
Apr 16 18:16:39.975969 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.975873 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 18:16:39.975969 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.975917 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 18:16:39.976121 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.976103 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 18:16:39.976173 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.976164 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 18:16:39.976392 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.976378 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-wzzrc\""
Apr 16 18:16:39.976953 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.976934 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:16:39.977058 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.976943 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 18:16:39.977058 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.976984 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-qw4f9\""
Apr 16 18:16:39.977291 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.977274 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-lp5l8"
Apr 16 18:16:39.978590 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.978569 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 18:16:39.978681 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.978624 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-2xs7s\""
Apr 16 18:16:39.978910 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.978893 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 18:16:39.979431 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.979369 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 18:16:39.979511 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.979434 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-kgn4n\""
Apr 16 18:16:39.979511 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.979435 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 18:16:39.983611 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.983592 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-run-ovn\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:16:39.983692 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.983623 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:16:39.983692 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.983640 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c4168765-8ac0-4395-a56d-b2991fa122e3-host-run-k8s-cni-cncf-io\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4"
Apr 16 18:16:39.983692 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.983657 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c4168765-8ac0-4395-a56d-b2991fa122e3-multus-conf-dir\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4"
Apr 16 18:16:39.983692 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.983678 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/23e0fe0d-a889-48ef-942d-1a02dac9ac5e-run\") pod \"tuned-bsv9k\" (UID: \"23e0fe0d-a889-48ef-942d-1a02dac9ac5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bsv9k"
Apr 16 18:16:39.983830 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.983716 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c7893c1b-506b-4195-91f7-dc9927cb3a36-device-dir\") pod \"aws-ebs-csi-driver-node-6bmgn\" (UID: \"c7893c1b-506b-4195-91f7-dc9927cb3a36\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6bmgn"
Apr 16 18:16:39.983830 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.983741 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c7893c1b-506b-4195-91f7-dc9927cb3a36-etc-selinux\") pod \"aws-ebs-csi-driver-node-6bmgn\" (UID: \"c7893c1b-506b-4195-91f7-dc9927cb3a36\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6bmgn"
Apr 16 18:16:39.983830 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.983761 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/99eb116c-fa90-4ac9-a593-ec208e5f2f43-konnectivity-ca\") pod \"konnectivity-agent-c6ncp\" (UID: \"99eb116c-fa90-4ac9-a593-ec208e5f2f43\") " pod="kube-system/konnectivity-agent-c6ncp"
Apr 16 18:16:39.983934 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.983825 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt8nk\" (UniqueName: \"kubernetes.io/projected/18fc33d6-c4dd-487b-8457-811880ffd3ea-kube-api-access-xt8nk\") pod \"node-resolver-ll55v\" (UID: \"18fc33d6-c4dd-487b-8457-811880ffd3ea\") " pod="openshift-dns/node-resolver-ll55v"
Apr 16 18:16:39.983934 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.983859 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c4168765-8ac0-4395-a56d-b2991fa122e3-cni-binary-copy\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4"
Apr 16 18:16:39.983934 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.983878 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv4l5\" (UniqueName: \"kubernetes.io/projected/a7930c14-4ef0-4949-a2ae-9a240da66c3c-kube-api-access-hv4l5\") pod \"network-metrics-daemon-sw2bl\" (UID: \"a7930c14-4ef0-4949-a2ae-9a240da66c3c\") " pod="openshift-multus/network-metrics-daemon-sw2bl"
Apr 16 18:16:39.983934 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.983892 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/23e0fe0d-a889-48ef-942d-1a02dac9ac5e-etc-systemd\") pod \"tuned-bsv9k\" (UID: \"23e0fe0d-a889-48ef-942d-1a02dac9ac5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bsv9k"
Apr 16 18:16:39.983934 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.983914 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c4168765-8ac0-4395-a56d-b2991fa122e3-system-cni-dir\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4"
Apr 16 18:16:39.984097 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.983950 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c4168765-8ac0-4395-a56d-b2991fa122e3-multus-cni-dir\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4"
Apr 16 18:16:39.984097 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.983970 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c4168765-8ac0-4395-a56d-b2991fa122e3-host-var-lib-cni-multus\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4"
Apr 16 18:16:39.984097 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.983986 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5df74\" (UniqueName: \"kubernetes.io/projected/5146021c-a86d-4b5f-a47d-7f8c736f756e-kube-api-access-5df74\") pod \"node-ca-d89lv\" (UID: \"5146021c-a86d-4b5f-a47d-7f8c736f756e\") " pod="openshift-image-registry/node-ca-d89lv"
Apr 16 18:16:39.984097 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.984007 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/23e0fe0d-a889-48ef-942d-1a02dac9ac5e-tmp\") pod \"tuned-bsv9k\" (UID: \"23e0fe0d-a889-48ef-942d-1a02dac9ac5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bsv9k"
Apr 16 18:16:39.984097 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.984026 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-host-kubelet\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:16:39.984097 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.984042 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-host-run-ovn-kubernetes\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:16:39.984097 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.984056 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-ovnkube-script-lib\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:16:39.984097 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.984075 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c4168765-8ac0-4395-a56d-b2991fa122e3-os-release\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4"
Apr 16 18:16:39.984417 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.984100 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/23e0fe0d-a889-48ef-942d-1a02dac9ac5e-lib-modules\") pod \"tuned-bsv9k\" (UID: \"23e0fe0d-a889-48ef-942d-1a02dac9ac5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bsv9k"
Apr 16 18:16:39.984417 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.984116 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j7f6\" (UniqueName: \"kubernetes.io/projected/24c29d62-5808-4ee7-92cd-ce4e68faf741-kube-api-access-5j7f6\") pod \"iptables-alerter-t6c2c\" (UID: \"24c29d62-5808-4ee7-92cd-ce4e68faf741\") " pod="openshift-network-operator/iptables-alerter-t6c2c"
Apr 16 18:16:39.984417 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.984132 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-etc-openvswitch\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:16:39.984417 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.984151 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzmjc\" (UniqueName: \"kubernetes.io/projected/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-kube-api-access-jzmjc\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:16:39.984417 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.984169 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23e0fe0d-a889-48ef-942d-1a02dac9ac5e-etc-kubernetes\") pod \"tuned-bsv9k\" (UID: \"23e0fe0d-a889-48ef-942d-1a02dac9ac5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bsv9k"
Apr 16 18:16:39.984417 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.984184 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-host-slash\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:16:39.984417 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.984204 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-host-cni-bin\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:16:39.984417 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.984224 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/23e0fe0d-a889-48ef-942d-1a02dac9ac5e-var-lib-kubelet\") pod \"tuned-bsv9k\" (UID: \"23e0fe0d-a889-48ef-942d-1a02dac9ac5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bsv9k"
Apr 16 18:16:39.984417 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.984270 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/c7893c1b-506b-4195-91f7-dc9927cb3a36-socket-dir\") pod \"aws-ebs-csi-driver-node-6bmgn\" (UID: \"c7893c1b-506b-4195-91f7-dc9927cb3a36\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6bmgn"
Apr 16 18:16:39.984417 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.984311 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c7893c1b-506b-4195-91f7-dc9927cb3a36-registration-dir\") pod \"aws-ebs-csi-driver-node-6bmgn\" (UID: \"c7893c1b-506b-4195-91f7-dc9927cb3a36\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6bmgn"
Apr 16 18:16:39.984417 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.984328 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-log-socket\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:16:39.984417 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.984364 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-host-cni-netd\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:16:39.984417 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.984384 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c4168765-8ac0-4395-a56d-b2991fa122e3-host-var-lib-kubelet\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4"
Apr 16 18:16:39.984417 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.984383 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:16:39.984417 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.984397 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c4168765-8ac0-4395-a56d-b2991fa122e3-multus-daemon-config\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4"
Apr 16 18:16:39.984417 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.984424 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5146021c-a86d-4b5f-a47d-7f8c736f756e-host\") pod \"node-ca-d89lv\" (UID: \"5146021c-a86d-4b5f-a47d-7f8c736f756e\") " pod="openshift-image-registry/node-ca-d89lv"
Apr 16 18:16:39.985078 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.984440 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-run-systemd\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:16:39.985078 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.984473 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-run-openvswitch\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:16:39.985078 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.984496 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/99eb116c-fa90-4ac9-a593-ec208e5f2f43-agent-certs\") pod \"konnectivity-agent-c6ncp\" (UID: \"99eb116c-fa90-4ac9-a593-ec208e5f2f43\") " pod="kube-system/konnectivity-agent-c6ncp"
Apr 16 18:16:39.985078 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.984511 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clznr\" (UniqueName: \"kubernetes.io/projected/c7893c1b-506b-4195-91f7-dc9927cb3a36-kube-api-access-clznr\") pod \"aws-ebs-csi-driver-node-6bmgn\" (UID: \"c7893c1b-506b-4195-91f7-dc9927cb3a36\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6bmgn"
Apr 16 18:16:39.985078 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.984526 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c4168765-8ac0-4395-a56d-b2991fa122e3-host-var-lib-cni-bin\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4"
Apr 16 18:16:39.985078 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.984579 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4168765-8ac0-4395-a56d-b2991fa122e3-etc-kubernetes\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4"
Apr 16 18:16:39.985078 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.984619 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7930c14-4ef0-4949-a2ae-9a240da66c3c-metrics-certs\") pod \"network-metrics-daemon-sw2bl\" (UID: \"a7930c14-4ef0-4949-a2ae-9a240da66c3c\") " pod="openshift-multus/network-metrics-daemon-sw2bl"
Apr 16 18:16:39.985078 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.984642 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/24c29d62-5808-4ee7-92cd-ce4e68faf741-host-slash\") pod \"iptables-alerter-t6c2c\" (UID: \"24c29d62-5808-4ee7-92cd-ce4e68faf741\") " pod="openshift-network-operator/iptables-alerter-t6c2c"
Apr 16 18:16:39.985078 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.984673 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-systemd-units\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:16:39.985078 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.984698 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-host-run-netns\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:16:39.985078 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.984721 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-ovnkube-config\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:16:39.985078 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.984757 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/23e0fe0d-a889-48ef-942d-1a02dac9ac5e-etc-modprobe-d\") pod \"tuned-bsv9k\" (UID: \"23e0fe0d-a889-48ef-942d-1a02dac9ac5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bsv9k"
Apr 16 18:16:39.985078 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.984788 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/23e0fe0d-a889-48ef-942d-1a02dac9ac5e-sys\") pod \"tuned-bsv9k\" (UID: \"23e0fe0d-a889-48ef-942d-1a02dac9ac5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bsv9k"
Apr 16 18:16:39.985078 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.984812 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-node-log\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:16:39.985078 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.984835 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c4168765-8ac0-4395-a56d-b2991fa122e3-hostroot\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4"
Apr 16 18:16:39.985078 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.984858 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fwd4\" (UniqueName: \"kubernetes.io/projected/c4168765-8ac0-4395-a56d-b2991fa122e3-kube-api-access-9fwd4\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4"
Apr 16 18:16:39.985078 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.984881 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/23e0fe0d-a889-48ef-942d-1a02dac9ac5e-etc-sysconfig\") pod \"tuned-bsv9k\" (UID: \"23e0fe0d-a889-48ef-942d-1a02dac9ac5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bsv9k" Apr 16 18:16:39.985688 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.984905 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/23e0fe0d-a889-48ef-942d-1a02dac9ac5e-etc-sysctl-d\") pod \"tuned-bsv9k\" (UID: \"23e0fe0d-a889-48ef-942d-1a02dac9ac5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bsv9k" Apr 16 18:16:39.985688 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.984928 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/23e0fe0d-a889-48ef-942d-1a02dac9ac5e-etc-sysctl-conf\") pod \"tuned-bsv9k\" (UID: \"23e0fe0d-a889-48ef-942d-1a02dac9ac5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bsv9k" Apr 16 18:16:39.985688 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.984957 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/23e0fe0d-a889-48ef-942d-1a02dac9ac5e-etc-tuned\") pod \"tuned-bsv9k\" (UID: \"23e0fe0d-a889-48ef-942d-1a02dac9ac5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bsv9k" Apr 16 18:16:39.985688 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.984995 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-ovn-node-metrics-cert\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv" Apr 16 18:16:39.985688 ip-10-0-131-203 kubenswrapper[2573]: I0416 
18:16:39.985036 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/18fc33d6-c4dd-487b-8457-811880ffd3ea-hosts-file\") pod \"node-resolver-ll55v\" (UID: \"18fc33d6-c4dd-487b-8457-811880ffd3ea\") " pod="openshift-dns/node-resolver-ll55v" Apr 16 18:16:39.985688 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.985105 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c4168765-8ac0-4395-a56d-b2991fa122e3-multus-socket-dir-parent\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4" Apr 16 18:16:39.985688 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.985143 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgmln\" (UniqueName: \"kubernetes.io/projected/2571d812-7882-455b-be2f-4e3888df0e6a-kube-api-access-wgmln\") pod \"network-check-target-jgrxb\" (UID: \"2571d812-7882-455b-be2f-4e3888df0e6a\") " pod="openshift-network-diagnostics/network-check-target-jgrxb" Apr 16 18:16:39.985688 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.985168 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/24c29d62-5808-4ee7-92cd-ce4e68faf741-iptables-alerter-script\") pod \"iptables-alerter-t6c2c\" (UID: \"24c29d62-5808-4ee7-92cd-ce4e68faf741\") " pod="openshift-network-operator/iptables-alerter-t6c2c" Apr 16 18:16:39.985688 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.985190 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c4168765-8ac0-4395-a56d-b2991fa122e3-host-run-multus-certs\") pod \"multus-kq8d4\" 
(UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4" Apr 16 18:16:39.985688 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.985205 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72fn5\" (UniqueName: \"kubernetes.io/projected/23e0fe0d-a889-48ef-942d-1a02dac9ac5e-kube-api-access-72fn5\") pod \"tuned-bsv9k\" (UID: \"23e0fe0d-a889-48ef-942d-1a02dac9ac5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bsv9k" Apr 16 18:16:39.985688 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.985238 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-var-lib-openvswitch\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv" Apr 16 18:16:39.985688 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.985275 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/18fc33d6-c4dd-487b-8457-811880ffd3ea-tmp-dir\") pod \"node-resolver-ll55v\" (UID: \"18fc33d6-c4dd-487b-8457-811880ffd3ea\") " pod="openshift-dns/node-resolver-ll55v" Apr 16 18:16:39.985688 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.985309 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c4168765-8ac0-4395-a56d-b2991fa122e3-host-run-netns\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4" Apr 16 18:16:39.985688 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.985344 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-env-overrides\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv" Apr 16 18:16:39.985688 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.985366 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c4168765-8ac0-4395-a56d-b2991fa122e3-cnibin\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4" Apr 16 18:16:39.985688 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.985400 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5146021c-a86d-4b5f-a47d-7f8c736f756e-serviceca\") pod \"node-ca-d89lv\" (UID: \"5146021c-a86d-4b5f-a47d-7f8c736f756e\") " pod="openshift-image-registry/node-ca-d89lv" Apr 16 18:16:39.985688 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.985429 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23e0fe0d-a889-48ef-942d-1a02dac9ac5e-host\") pod \"tuned-bsv9k\" (UID: \"23e0fe0d-a889-48ef-942d-1a02dac9ac5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bsv9k" Apr 16 18:16:39.986133 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.985453 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c7893c1b-506b-4195-91f7-dc9927cb3a36-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6bmgn\" (UID: \"c7893c1b-506b-4195-91f7-dc9927cb3a36\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6bmgn" Apr 16 18:16:39.986133 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:39.985476 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c7893c1b-506b-4195-91f7-dc9927cb3a36-sys-fs\") pod \"aws-ebs-csi-driver-node-6bmgn\" (UID: \"c7893c1b-506b-4195-91f7-dc9927cb3a36\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6bmgn" Apr 16 18:16:40.013339 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.013319 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-lnccb" Apr 16 18:16:40.019607 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.019588 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-lnccb" Apr 16 18:16:40.048469 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:40.048444 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcad5fa07f782ca2be3a004d69f182f5f.slice/crio-b8459c8d98d2773426774db322a439bb6bb69b62102ef1c43e781beeb5833aa6 WatchSource:0}: Error finding container b8459c8d98d2773426774db322a439bb6bb69b62102ef1c43e781beeb5833aa6: Status 404 returned error can't find the container with id b8459c8d98d2773426774db322a439bb6bb69b62102ef1c43e781beeb5833aa6 Apr 16 18:16:40.048969 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:40.048953 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e77e2e37a581a4f0ed7090fbdb2ba3d.slice/crio-f5331adeed5e45de8e87cc8b9d45681788b4d1a250ee41f69f0dc25709fdaef7 WatchSource:0}: Error finding container f5331adeed5e45de8e87cc8b9d45681788b4d1a250ee41f69f0dc25709fdaef7: Status 404 returned error can't find the container with id f5331adeed5e45de8e87cc8b9d45681788b4d1a250ee41f69f0dc25709fdaef7 Apr 16 18:16:40.054026 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.054011 2573 provider.go:93] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Apr 16 18:16:40.075963 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.075947 2573 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 18:16:40.085871 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.085850 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-etc-openvswitch\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv" Apr 16 18:16:40.085939 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.085875 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jzmjc\" (UniqueName: \"kubernetes.io/projected/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-kube-api-access-jzmjc\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv" Apr 16 18:16:40.085939 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.085893 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23e0fe0d-a889-48ef-942d-1a02dac9ac5e-etc-kubernetes\") pod \"tuned-bsv9k\" (UID: \"23e0fe0d-a889-48ef-942d-1a02dac9ac5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bsv9k" Apr 16 18:16:40.085939 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.085907 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-host-slash\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv" Apr 16 18:16:40.086057 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.085949 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23e0fe0d-a889-48ef-942d-1a02dac9ac5e-etc-kubernetes\") pod \"tuned-bsv9k\" (UID: \"23e0fe0d-a889-48ef-942d-1a02dac9ac5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bsv9k" Apr 16 18:16:40.086057 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.085984 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-etc-openvswitch\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv" Apr 16 18:16:40.086149 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086111 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-host-slash\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv" Apr 16 18:16:40.086149 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086144 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-host-cni-bin\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv" Apr 16 18:16:40.086254 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086172 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/23e0fe0d-a889-48ef-942d-1a02dac9ac5e-var-lib-kubelet\") pod \"tuned-bsv9k\" (UID: \"23e0fe0d-a889-48ef-942d-1a02dac9ac5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bsv9k" Apr 16 18:16:40.086254 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086193 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-host-cni-bin\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv" Apr 16 18:16:40.086254 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086196 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c7893c1b-506b-4195-91f7-dc9927cb3a36-socket-dir\") pod \"aws-ebs-csi-driver-node-6bmgn\" (UID: \"c7893c1b-506b-4195-91f7-dc9927cb3a36\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6bmgn" Apr 16 18:16:40.086254 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086221 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c7893c1b-506b-4195-91f7-dc9927cb3a36-registration-dir\") pod \"aws-ebs-csi-driver-node-6bmgn\" (UID: \"c7893c1b-506b-4195-91f7-dc9927cb3a36\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6bmgn" Apr 16 18:16:40.086254 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086246 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-log-socket\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv" Apr 16 18:16:40.086473 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086257 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/23e0fe0d-a889-48ef-942d-1a02dac9ac5e-var-lib-kubelet\") pod \"tuned-bsv9k\" (UID: \"23e0fe0d-a889-48ef-942d-1a02dac9ac5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bsv9k" Apr 16 18:16:40.086473 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086268 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-host-cni-netd\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv" Apr 16 18:16:40.086473 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086276 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c7893c1b-506b-4195-91f7-dc9927cb3a36-socket-dir\") pod \"aws-ebs-csi-driver-node-6bmgn\" (UID: \"c7893c1b-506b-4195-91f7-dc9927cb3a36\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6bmgn" Apr 16 18:16:40.086473 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086309 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c4168765-8ac0-4395-a56d-b2991fa122e3-host-var-lib-kubelet\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4" Apr 16 18:16:40.086473 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086313 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-log-socket\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv" Apr 16 18:16:40.086473 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086314 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c7893c1b-506b-4195-91f7-dc9927cb3a36-registration-dir\") pod \"aws-ebs-csi-driver-node-6bmgn\" (UID: \"c7893c1b-506b-4195-91f7-dc9927cb3a36\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6bmgn" Apr 16 18:16:40.086473 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086340 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-host-cni-netd\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv" Apr 16 18:16:40.086473 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086364 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c4168765-8ac0-4395-a56d-b2991fa122e3-host-var-lib-kubelet\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4" Apr 16 18:16:40.086473 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086390 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c4168765-8ac0-4395-a56d-b2991fa122e3-multus-daemon-config\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4" Apr 16 18:16:40.086473 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086418 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5146021c-a86d-4b5f-a47d-7f8c736f756e-host\") pod \"node-ca-d89lv\" (UID: \"5146021c-a86d-4b5f-a47d-7f8c736f756e\") " pod="openshift-image-registry/node-ca-d89lv" Apr 16 18:16:40.086473 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086449 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w6q8\" (UniqueName: \"kubernetes.io/projected/e39063f8-caec-45bd-ae4f-e11765edec8b-kube-api-access-4w6q8\") pod \"multus-additional-cni-plugins-lp5l8\" (UID: \"e39063f8-caec-45bd-ae4f-e11765edec8b\") " pod="openshift-multus/multus-additional-cni-plugins-lp5l8" Apr 16 18:16:40.086473 ip-10-0-131-203 kubenswrapper[2573]: I0416 
18:16:40.086479 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-run-systemd\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv" Apr 16 18:16:40.086997 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086511 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5146021c-a86d-4b5f-a47d-7f8c736f756e-host\") pod \"node-ca-d89lv\" (UID: \"5146021c-a86d-4b5f-a47d-7f8c736f756e\") " pod="openshift-image-registry/node-ca-d89lv" Apr 16 18:16:40.086997 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086511 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-run-openvswitch\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv" Apr 16 18:16:40.086997 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086555 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-run-systemd\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv" Apr 16 18:16:40.086997 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086561 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/99eb116c-fa90-4ac9-a593-ec208e5f2f43-agent-certs\") pod \"konnectivity-agent-c6ncp\" (UID: \"99eb116c-fa90-4ac9-a593-ec208e5f2f43\") " pod="kube-system/konnectivity-agent-c6ncp" Apr 16 18:16:40.086997 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086566 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-run-openvswitch\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv" Apr 16 18:16:40.086997 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086586 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clznr\" (UniqueName: \"kubernetes.io/projected/c7893c1b-506b-4195-91f7-dc9927cb3a36-kube-api-access-clznr\") pod \"aws-ebs-csi-driver-node-6bmgn\" (UID: \"c7893c1b-506b-4195-91f7-dc9927cb3a36\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6bmgn" Apr 16 18:16:40.086997 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086604 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c4168765-8ac0-4395-a56d-b2991fa122e3-host-var-lib-cni-bin\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4" Apr 16 18:16:40.086997 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086619 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4168765-8ac0-4395-a56d-b2991fa122e3-etc-kubernetes\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4" Apr 16 18:16:40.086997 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086632 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7930c14-4ef0-4949-a2ae-9a240da66c3c-metrics-certs\") pod \"network-metrics-daemon-sw2bl\" (UID: \"a7930c14-4ef0-4949-a2ae-9a240da66c3c\") " pod="openshift-multus/network-metrics-daemon-sw2bl" Apr 16 18:16:40.086997 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086646 
2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/24c29d62-5808-4ee7-92cd-ce4e68faf741-host-slash\") pod \"iptables-alerter-t6c2c\" (UID: \"24c29d62-5808-4ee7-92cd-ce4e68faf741\") " pod="openshift-network-operator/iptables-alerter-t6c2c" Apr 16 18:16:40.086997 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086671 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e39063f8-caec-45bd-ae4f-e11765edec8b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lp5l8\" (UID: \"e39063f8-caec-45bd-ae4f-e11765edec8b\") " pod="openshift-multus/multus-additional-cni-plugins-lp5l8" Apr 16 18:16:40.086997 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086683 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4168765-8ac0-4395-a56d-b2991fa122e3-etc-kubernetes\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4" Apr 16 18:16:40.086997 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086698 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-systemd-units\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv" Apr 16 18:16:40.086997 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086721 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-host-run-netns\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv" Apr 16 18:16:40.086997 
ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086742 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-ovnkube-config\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv" Apr 16 18:16:40.086997 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086752 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/24c29d62-5808-4ee7-92cd-ce4e68faf741-host-slash\") pod \"iptables-alerter-t6c2c\" (UID: \"24c29d62-5808-4ee7-92cd-ce4e68faf741\") " pod="openshift-network-operator/iptables-alerter-t6c2c" Apr 16 18:16:40.086997 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086765 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/23e0fe0d-a889-48ef-942d-1a02dac9ac5e-etc-modprobe-d\") pod \"tuned-bsv9k\" (UID: \"23e0fe0d-a889-48ef-942d-1a02dac9ac5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bsv9k" Apr 16 18:16:40.086997 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:40.086768 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:16:40.087859 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086802 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/23e0fe0d-a889-48ef-942d-1a02dac9ac5e-sys\") pod \"tuned-bsv9k\" (UID: \"23e0fe0d-a889-48ef-942d-1a02dac9ac5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bsv9k" Apr 16 18:16:40.087859 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086811 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-systemd-units\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:16:40.087859 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086832 2573 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 18:16:40.087859 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:40.086846 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7930c14-4ef0-4949-a2ae-9a240da66c3c-metrics-certs podName:a7930c14-4ef0-4949-a2ae-9a240da66c3c nodeName:}" failed. No retries permitted until 2026-04-16 18:16:40.586818047 +0000 UTC m=+2.094083157 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7930c14-4ef0-4949-a2ae-9a240da66c3c-metrics-certs") pod "network-metrics-daemon-sw2bl" (UID: "a7930c14-4ef0-4949-a2ae-9a240da66c3c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:16:40.087859 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086852 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c4168765-8ac0-4395-a56d-b2991fa122e3-multus-daemon-config\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4"
Apr 16 18:16:40.087859 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086864 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/23e0fe0d-a889-48ef-942d-1a02dac9ac5e-etc-modprobe-d\") pod \"tuned-bsv9k\" (UID: \"23e0fe0d-a889-48ef-942d-1a02dac9ac5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bsv9k"
Apr 16 18:16:40.087859 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086868 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e39063f8-caec-45bd-ae4f-e11765edec8b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lp5l8\" (UID: \"e39063f8-caec-45bd-ae4f-e11765edec8b\") " pod="openshift-multus/multus-additional-cni-plugins-lp5l8"
Apr 16 18:16:40.087859 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086855 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/23e0fe0d-a889-48ef-942d-1a02dac9ac5e-sys\") pod \"tuned-bsv9k\" (UID: \"23e0fe0d-a889-48ef-942d-1a02dac9ac5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bsv9k"
Apr 16 18:16:40.087859 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086897 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-host-run-netns\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:16:40.087859 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086906 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-node-log\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:16:40.087859 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086932 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c4168765-8ac0-4395-a56d-b2991fa122e3-hostroot\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4"
Apr 16 18:16:40.087859 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086940 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-node-log\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:16:40.087859 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086945 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c4168765-8ac0-4395-a56d-b2991fa122e3-host-var-lib-cni-bin\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4"
Apr 16 18:16:40.087859 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086956 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9fwd4\" (UniqueName: \"kubernetes.io/projected/c4168765-8ac0-4395-a56d-b2991fa122e3-kube-api-access-9fwd4\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4"
Apr 16 18:16:40.087859 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086968 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c4168765-8ac0-4395-a56d-b2991fa122e3-hostroot\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4"
Apr 16 18:16:40.087859 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.086980 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/23e0fe0d-a889-48ef-942d-1a02dac9ac5e-etc-sysconfig\") pod \"tuned-bsv9k\" (UID: \"23e0fe0d-a889-48ef-942d-1a02dac9ac5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bsv9k"
Apr 16 18:16:40.087859 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087005 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/23e0fe0d-a889-48ef-942d-1a02dac9ac5e-etc-sysctl-d\") pod \"tuned-bsv9k\" (UID: \"23e0fe0d-a889-48ef-942d-1a02dac9ac5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bsv9k"
Apr 16 18:16:40.088667 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087022 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/23e0fe0d-a889-48ef-942d-1a02dac9ac5e-etc-sysctl-conf\") pod \"tuned-bsv9k\" (UID: \"23e0fe0d-a889-48ef-942d-1a02dac9ac5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bsv9k"
Apr 16 18:16:40.088667 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087036 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/23e0fe0d-a889-48ef-942d-1a02dac9ac5e-etc-tuned\") pod \"tuned-bsv9k\" (UID: \"23e0fe0d-a889-48ef-942d-1a02dac9ac5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bsv9k"
Apr 16 18:16:40.088667 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087042 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/23e0fe0d-a889-48ef-942d-1a02dac9ac5e-etc-sysconfig\") pod \"tuned-bsv9k\" (UID: \"23e0fe0d-a889-48ef-942d-1a02dac9ac5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bsv9k"
Apr 16 18:16:40.088667 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087055 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e39063f8-caec-45bd-ae4f-e11765edec8b-system-cni-dir\") pod \"multus-additional-cni-plugins-lp5l8\" (UID: \"e39063f8-caec-45bd-ae4f-e11765edec8b\") " pod="openshift-multus/multus-additional-cni-plugins-lp5l8"
Apr 16 18:16:40.088667 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087082 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-ovn-node-metrics-cert\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:16:40.088667 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087106 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/18fc33d6-c4dd-487b-8457-811880ffd3ea-hosts-file\") pod \"node-resolver-ll55v\" (UID: \"18fc33d6-c4dd-487b-8457-811880ffd3ea\") " pod="openshift-dns/node-resolver-ll55v"
Apr 16 18:16:40.088667 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087131 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c4168765-8ac0-4395-a56d-b2991fa122e3-multus-socket-dir-parent\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4"
Apr 16 18:16:40.088667 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087150 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/23e0fe0d-a889-48ef-942d-1a02dac9ac5e-etc-sysctl-conf\") pod \"tuned-bsv9k\" (UID: \"23e0fe0d-a889-48ef-942d-1a02dac9ac5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bsv9k"
Apr 16 18:16:40.088667 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087154 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wgmln\" (UniqueName: \"kubernetes.io/projected/2571d812-7882-455b-be2f-4e3888df0e6a-kube-api-access-wgmln\") pod \"network-check-target-jgrxb\" (UID: \"2571d812-7882-455b-be2f-4e3888df0e6a\") " pod="openshift-network-diagnostics/network-check-target-jgrxb"
Apr 16 18:16:40.088667 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087185 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/23e0fe0d-a889-48ef-942d-1a02dac9ac5e-etc-sysctl-d\") pod \"tuned-bsv9k\" (UID: \"23e0fe0d-a889-48ef-942d-1a02dac9ac5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bsv9k"
Apr 16 18:16:40.088667 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087192 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/24c29d62-5808-4ee7-92cd-ce4e68faf741-iptables-alerter-script\") pod \"iptables-alerter-t6c2c\" (UID: \"24c29d62-5808-4ee7-92cd-ce4e68faf741\") " pod="openshift-network-operator/iptables-alerter-t6c2c"
Apr 16 18:16:40.088667 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087260 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/18fc33d6-c4dd-487b-8457-811880ffd3ea-hosts-file\") pod \"node-resolver-ll55v\" (UID: \"18fc33d6-c4dd-487b-8457-811880ffd3ea\") " pod="openshift-dns/node-resolver-ll55v"
Apr 16 18:16:40.088667 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087292 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c4168765-8ac0-4395-a56d-b2991fa122e3-host-run-multus-certs\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4"
Apr 16 18:16:40.088667 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087308 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-ovnkube-config\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:16:40.088667 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087337 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-72fn5\" (UniqueName: \"kubernetes.io/projected/23e0fe0d-a889-48ef-942d-1a02dac9ac5e-kube-api-access-72fn5\") pod \"tuned-bsv9k\" (UID: \"23e0fe0d-a889-48ef-942d-1a02dac9ac5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bsv9k"
Apr 16 18:16:40.088667 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087363 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-var-lib-openvswitch\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:16:40.088667 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087366 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c4168765-8ac0-4395-a56d-b2991fa122e3-host-run-multus-certs\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4"
Apr 16 18:16:40.089372 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087385 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c4168765-8ac0-4395-a56d-b2991fa122e3-multus-socket-dir-parent\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4"
Apr 16 18:16:40.089372 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087405 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/18fc33d6-c4dd-487b-8457-811880ffd3ea-tmp-dir\") pod \"node-resolver-ll55v\" (UID: \"18fc33d6-c4dd-487b-8457-811880ffd3ea\") " pod="openshift-dns/node-resolver-ll55v"
Apr 16 18:16:40.089372 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087436 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-var-lib-openvswitch\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:16:40.089372 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087447 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c4168765-8ac0-4395-a56d-b2991fa122e3-host-run-netns\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4"
Apr 16 18:16:40.089372 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087492 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e39063f8-caec-45bd-ae4f-e11765edec8b-os-release\") pod \"multus-additional-cni-plugins-lp5l8\" (UID: \"e39063f8-caec-45bd-ae4f-e11765edec8b\") " pod="openshift-multus/multus-additional-cni-plugins-lp5l8"
Apr 16 18:16:40.089372 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087513 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c4168765-8ac0-4395-a56d-b2991fa122e3-host-run-netns\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4"
Apr 16 18:16:40.089372 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087520 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-env-overrides\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:16:40.089372 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087567 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e39063f8-caec-45bd-ae4f-e11765edec8b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lp5l8\" (UID: \"e39063f8-caec-45bd-ae4f-e11765edec8b\") " pod="openshift-multus/multus-additional-cni-plugins-lp5l8"
Apr 16 18:16:40.089372 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087679 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c4168765-8ac0-4395-a56d-b2991fa122e3-cnibin\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4"
Apr 16 18:16:40.089372 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087706 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5146021c-a86d-4b5f-a47d-7f8c736f756e-serviceca\") pod \"node-ca-d89lv\" (UID: \"5146021c-a86d-4b5f-a47d-7f8c736f756e\") " pod="openshift-image-registry/node-ca-d89lv"
Apr 16 18:16:40.089372 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087711 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/24c29d62-5808-4ee7-92cd-ce4e68faf741-iptables-alerter-script\") pod \"iptables-alerter-t6c2c\" (UID: \"24c29d62-5808-4ee7-92cd-ce4e68faf741\") " pod="openshift-network-operator/iptables-alerter-t6c2c"
Apr 16 18:16:40.089372 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087720 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/18fc33d6-c4dd-487b-8457-811880ffd3ea-tmp-dir\") pod \"node-resolver-ll55v\" (UID: \"18fc33d6-c4dd-487b-8457-811880ffd3ea\") " pod="openshift-dns/node-resolver-ll55v"
Apr 16 18:16:40.089372 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087728 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23e0fe0d-a889-48ef-942d-1a02dac9ac5e-host\") pod \"tuned-bsv9k\" (UID: \"23e0fe0d-a889-48ef-942d-1a02dac9ac5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bsv9k"
Apr 16 18:16:40.089372 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087752 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c7893c1b-506b-4195-91f7-dc9927cb3a36-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6bmgn\" (UID: \"c7893c1b-506b-4195-91f7-dc9927cb3a36\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6bmgn"
Apr 16 18:16:40.089372 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087777 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c7893c1b-506b-4195-91f7-dc9927cb3a36-sys-fs\") pod \"aws-ebs-csi-driver-node-6bmgn\" (UID: \"c7893c1b-506b-4195-91f7-dc9927cb3a36\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6bmgn"
Apr 16 18:16:40.089372 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087784 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c4168765-8ac0-4395-a56d-b2991fa122e3-cnibin\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4"
Apr 16 18:16:40.089372 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087797 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23e0fe0d-a889-48ef-942d-1a02dac9ac5e-host\") pod \"tuned-bsv9k\" (UID: \"23e0fe0d-a889-48ef-942d-1a02dac9ac5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bsv9k"
Apr 16 18:16:40.089372 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087834 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c7893c1b-506b-4195-91f7-dc9927cb3a36-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6bmgn\" (UID: \"c7893c1b-506b-4195-91f7-dc9927cb3a36\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6bmgn"
Apr 16 18:16:40.089941 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087801 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-run-ovn\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:16:40.089941 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087857 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-run-ovn\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:16:40.089941 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087877 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:16:40.089941 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087916 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:16:40.089941 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087928 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c4168765-8ac0-4395-a56d-b2991fa122e3-host-run-k8s-cni-cncf-io\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4"
Apr 16 18:16:40.089941 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087882 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c7893c1b-506b-4195-91f7-dc9927cb3a36-sys-fs\") pod \"aws-ebs-csi-driver-node-6bmgn\" (UID: \"c7893c1b-506b-4195-91f7-dc9927cb3a36\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6bmgn"
Apr 16 18:16:40.089941 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087964 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c4168765-8ac0-4395-a56d-b2991fa122e3-multus-conf-dir\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4"
Apr 16 18:16:40.089941 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087978 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c4168765-8ac0-4395-a56d-b2991fa122e3-host-run-k8s-cni-cncf-io\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4"
Apr 16 18:16:40.089941 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.087989 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/23e0fe0d-a889-48ef-942d-1a02dac9ac5e-run\") pod \"tuned-bsv9k\" (UID: \"23e0fe0d-a889-48ef-942d-1a02dac9ac5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bsv9k"
Apr 16 18:16:40.089941 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.088012 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c7893c1b-506b-4195-91f7-dc9927cb3a36-device-dir\") pod \"aws-ebs-csi-driver-node-6bmgn\" (UID: \"c7893c1b-506b-4195-91f7-dc9927cb3a36\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6bmgn"
Apr 16 18:16:40.089941 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.088019 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c4168765-8ac0-4395-a56d-b2991fa122e3-multus-conf-dir\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4"
Apr 16 18:16:40.089941 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.088035 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c7893c1b-506b-4195-91f7-dc9927cb3a36-etc-selinux\") pod \"aws-ebs-csi-driver-node-6bmgn\" (UID: \"c7893c1b-506b-4195-91f7-dc9927cb3a36\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6bmgn"
Apr 16 18:16:40.089941 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.088059 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/99eb116c-fa90-4ac9-a593-ec208e5f2f43-konnectivity-ca\") pod \"konnectivity-agent-c6ncp\" (UID: \"99eb116c-fa90-4ac9-a593-ec208e5f2f43\") " pod="kube-system/konnectivity-agent-c6ncp"
Apr 16 18:16:40.089941 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.088062 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c7893c1b-506b-4195-91f7-dc9927cb3a36-device-dir\") pod \"aws-ebs-csi-driver-node-6bmgn\" (UID: \"c7893c1b-506b-4195-91f7-dc9927cb3a36\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6bmgn"
Apr 16 18:16:40.089941 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.088037 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/23e0fe0d-a889-48ef-942d-1a02dac9ac5e-run\") pod \"tuned-bsv9k\" (UID: \"23e0fe0d-a889-48ef-942d-1a02dac9ac5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bsv9k"
Apr 16 18:16:40.089941 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.088077 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5146021c-a86d-4b5f-a47d-7f8c736f756e-serviceca\") pod \"node-ca-d89lv\" (UID: \"5146021c-a86d-4b5f-a47d-7f8c736f756e\") " pod="openshift-image-registry/node-ca-d89lv"
Apr 16 18:16:40.089941 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.088083 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xt8nk\" (UniqueName: \"kubernetes.io/projected/18fc33d6-c4dd-487b-8457-811880ffd3ea-kube-api-access-xt8nk\") pod \"node-resolver-ll55v\" (UID: \"18fc33d6-c4dd-487b-8457-811880ffd3ea\") " pod="openshift-dns/node-resolver-ll55v"
Apr 16 18:16:40.090596 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.088129 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c4168765-8ac0-4395-a56d-b2991fa122e3-cni-binary-copy\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4"
Apr 16 18:16:40.090596 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.088136 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c7893c1b-506b-4195-91f7-dc9927cb3a36-etc-selinux\") pod \"aws-ebs-csi-driver-node-6bmgn\" (UID: \"c7893c1b-506b-4195-91f7-dc9927cb3a36\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6bmgn"
Apr 16 18:16:40.090596 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.088159 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hv4l5\" (UniqueName: \"kubernetes.io/projected/a7930c14-4ef0-4949-a2ae-9a240da66c3c-kube-api-access-hv4l5\") pod \"network-metrics-daemon-sw2bl\" (UID: \"a7930c14-4ef0-4949-a2ae-9a240da66c3c\") " pod="openshift-multus/network-metrics-daemon-sw2bl"
Apr 16 18:16:40.090596 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.088184 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/23e0fe0d-a889-48ef-942d-1a02dac9ac5e-etc-systemd\") pod \"tuned-bsv9k\" (UID: \"23e0fe0d-a889-48ef-942d-1a02dac9ac5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bsv9k"
Apr 16 18:16:40.090596 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.088210 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e39063f8-caec-45bd-ae4f-e11765edec8b-cni-binary-copy\") pod \"multus-additional-cni-plugins-lp5l8\" (UID: \"e39063f8-caec-45bd-ae4f-e11765edec8b\") " pod="openshift-multus/multus-additional-cni-plugins-lp5l8"
Apr 16 18:16:40.090596 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.088242 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c4168765-8ac0-4395-a56d-b2991fa122e3-system-cni-dir\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4"
Apr 16 18:16:40.090596 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.088268 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c4168765-8ac0-4395-a56d-b2991fa122e3-multus-cni-dir\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4"
Apr 16 18:16:40.090596 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.088292 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c4168765-8ac0-4395-a56d-b2991fa122e3-host-var-lib-cni-multus\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4"
Apr 16 18:16:40.090596 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.088296 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/23e0fe0d-a889-48ef-942d-1a02dac9ac5e-etc-systemd\") pod \"tuned-bsv9k\" (UID: \"23e0fe0d-a889-48ef-942d-1a02dac9ac5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bsv9k"
Apr 16 18:16:40.090596 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.088314 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5df74\" (UniqueName: \"kubernetes.io/projected/5146021c-a86d-4b5f-a47d-7f8c736f756e-kube-api-access-5df74\") pod \"node-ca-d89lv\" (UID: \"5146021c-a86d-4b5f-a47d-7f8c736f756e\") " pod="openshift-image-registry/node-ca-d89lv"
Apr 16 18:16:40.090596 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.088335 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/23e0fe0d-a889-48ef-942d-1a02dac9ac5e-tmp\") pod \"tuned-bsv9k\" (UID: \"23e0fe0d-a889-48ef-942d-1a02dac9ac5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bsv9k"
Apr 16 18:16:40.090596 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.088358 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e39063f8-caec-45bd-ae4f-e11765edec8b-cnibin\") pod \"multus-additional-cni-plugins-lp5l8\" (UID: \"e39063f8-caec-45bd-ae4f-e11765edec8b\") " pod="openshift-multus/multus-additional-cni-plugins-lp5l8"
Apr 16 18:16:40.090596 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.088384 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-host-kubelet\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:16:40.090596 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.088407 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-host-run-ovn-kubernetes\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:16:40.090596 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.088431 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-ovnkube-script-lib\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:16:40.090596 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.088454 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c4168765-8ac0-4395-a56d-b2991fa122e3-os-release\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4"
Apr 16 18:16:40.090596 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.088523 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c4168765-8ac0-4395-a56d-b2991fa122e3-os-release\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4"
Apr 16 18:16:40.091024 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.088633 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/99eb116c-fa90-4ac9-a593-ec208e5f2f43-konnectivity-ca\") pod \"konnectivity-agent-c6ncp\" (UID: \"99eb116c-fa90-4ac9-a593-ec208e5f2f43\") " pod="kube-system/konnectivity-agent-c6ncp"
Apr 16 18:16:40.091024 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.088669 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c4168765-8ac0-4395-a56d-b2991fa122e3-cni-binary-copy\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4"
Apr 16 18:16:40.091024 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.088688 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c4168765-8ac0-4395-a56d-b2991fa122e3-host-var-lib-cni-multus\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4"
Apr 16 18:16:40.091024 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.088676 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c4168765-8ac0-4395-a56d-b2991fa122e3-system-cni-dir\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4"
Apr 16 18:16:40.091024 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.088697 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/23e0fe0d-a889-48ef-942d-1a02dac9ac5e-lib-modules\") pod \"tuned-bsv9k\" (UID: \"23e0fe0d-a889-48ef-942d-1a02dac9ac5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bsv9k"
Apr 16 18:16:40.091024 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.088728 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5j7f6\" (UniqueName: \"kubernetes.io/projected/24c29d62-5808-4ee7-92cd-ce4e68faf741-kube-api-access-5j7f6\") pod \"iptables-alerter-t6c2c\" (UID: \"24c29d62-5808-4ee7-92cd-ce4e68faf741\") " pod="openshift-network-operator/iptables-alerter-t6c2c"
Apr 16 18:16:40.091024 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.088737 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-host-kubelet\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:16:40.091024 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.088744 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c4168765-8ac0-4395-a56d-b2991fa122e3-multus-cni-dir\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4"
Apr 16 18:16:40.091024 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.088793 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-host-run-ovn-kubernetes\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:16:40.091024 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.088838 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/23e0fe0d-a889-48ef-942d-1a02dac9ac5e-lib-modules\") pod \"tuned-bsv9k\" (UID: \"23e0fe0d-a889-48ef-942d-1a02dac9ac5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bsv9k"
Apr 16 18:16:40.091024 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.089055 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-env-overrides\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:16:40.091024 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.089208 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-ovnkube-script-lib\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:16:40.091024 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.089847 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/23e0fe0d-a889-48ef-942d-1a02dac9ac5e-etc-tuned\") pod \"tuned-bsv9k\" (UID: \"23e0fe0d-a889-48ef-942d-1a02dac9ac5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bsv9k"
Apr 16 18:16:40.091024 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.090144 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/99eb116c-fa90-4ac9-a593-ec208e5f2f43-agent-certs\") pod \"konnectivity-agent-c6ncp\" (UID: \"99eb116c-fa90-4ac9-a593-ec208e5f2f43\") " pod="kube-system/konnectivity-agent-c6ncp"
Apr 16 18:16:40.091024 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.090449 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/23e0fe0d-a889-48ef-942d-1a02dac9ac5e-tmp\") pod \"tuned-bsv9k\" (UID: \"23e0fe0d-a889-48ef-942d-1a02dac9ac5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bsv9k"
Apr 16 18:16:40.091024 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.090618 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-ovn-node-metrics-cert\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:16:40.097357 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.097340 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzmjc\" (UniqueName: \"kubernetes.io/projected/7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3-kube-api-access-jzmjc\") pod \"ovnkube-node-tj7xv\" (UID: \"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:16:40.098780 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.098752 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5df74\" (UniqueName: \"kubernetes.io/projected/5146021c-a86d-4b5f-a47d-7f8c736f756e-kube-api-access-5df74\") pod \"node-ca-d89lv\" (UID: \"5146021c-a86d-4b5f-a47d-7f8c736f756e\") " pod="openshift-image-registry/node-ca-d89lv"
Apr 16 18:16:40.099352 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:40.099317 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:16:40.099352 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:40.099339 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:16:40.099499 ip-10-0-131-203 kubenswrapper[2573]: E0416
18:16:40.099366 2573 projected.go:194] Error preparing data for projected volume kube-api-access-wgmln for pod openshift-network-diagnostics/network-check-target-jgrxb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:16:40.099499 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:40.099431 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2571d812-7882-455b-be2f-4e3888df0e6a-kube-api-access-wgmln podName:2571d812-7882-455b-be2f-4e3888df0e6a nodeName:}" failed. No retries permitted until 2026-04-16 18:16:40.599415031 +0000 UTC m=+2.106680133 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-wgmln" (UniqueName: "kubernetes.io/projected/2571d812-7882-455b-be2f-4e3888df0e6a-kube-api-access-wgmln") pod "network-check-target-jgrxb" (UID: "2571d812-7882-455b-be2f-4e3888df0e6a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:16:40.099682 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.099580 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-clznr\" (UniqueName: \"kubernetes.io/projected/c7893c1b-506b-4195-91f7-dc9927cb3a36-kube-api-access-clznr\") pod \"aws-ebs-csi-driver-node-6bmgn\" (UID: \"c7893c1b-506b-4195-91f7-dc9927cb3a36\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6bmgn" Apr 16 18:16:40.099880 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.099865 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j7f6\" (UniqueName: \"kubernetes.io/projected/24c29d62-5808-4ee7-92cd-ce4e68faf741-kube-api-access-5j7f6\") pod \"iptables-alerter-t6c2c\" (UID: \"24c29d62-5808-4ee7-92cd-ce4e68faf741\") " pod="openshift-network-operator/iptables-alerter-t6c2c" Apr 16 
18:16:40.099976 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.099957 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fwd4\" (UniqueName: \"kubernetes.io/projected/c4168765-8ac0-4395-a56d-b2991fa122e3-kube-api-access-9fwd4\") pod \"multus-kq8d4\" (UID: \"c4168765-8ac0-4395-a56d-b2991fa122e3\") " pod="openshift-multus/multus-kq8d4" Apr 16 18:16:40.100830 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.100813 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-72fn5\" (UniqueName: \"kubernetes.io/projected/23e0fe0d-a889-48ef-942d-1a02dac9ac5e-kube-api-access-72fn5\") pod \"tuned-bsv9k\" (UID: \"23e0fe0d-a889-48ef-942d-1a02dac9ac5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bsv9k" Apr 16 18:16:40.101135 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.101113 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt8nk\" (UniqueName: \"kubernetes.io/projected/18fc33d6-c4dd-487b-8457-811880ffd3ea-kube-api-access-xt8nk\") pod \"node-resolver-ll55v\" (UID: \"18fc33d6-c4dd-487b-8457-811880ffd3ea\") " pod="openshift-dns/node-resolver-ll55v" Apr 16 18:16:40.101342 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.101326 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv4l5\" (UniqueName: \"kubernetes.io/projected/a7930c14-4ef0-4949-a2ae-9a240da66c3c-kube-api-access-hv4l5\") pod \"network-metrics-daemon-sw2bl\" (UID: \"a7930c14-4ef0-4949-a2ae-9a240da66c3c\") " pod="openshift-multus/network-metrics-daemon-sw2bl" Apr 16 18:16:40.119314 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.119278 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-203.ec2.internal" event={"ID":"cad5fa07f782ca2be3a004d69f182f5f","Type":"ContainerStarted","Data":"b8459c8d98d2773426774db322a439bb6bb69b62102ef1c43e781beeb5833aa6"} Apr 16 18:16:40.120105 
ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.120080 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-203.ec2.internal" event={"ID":"0e77e2e37a581a4f0ed7090fbdb2ba3d","Type":"ContainerStarted","Data":"f5331adeed5e45de8e87cc8b9d45681788b4d1a250ee41f69f0dc25709fdaef7"} Apr 16 18:16:40.148664 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.148641 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:16:40.189831 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.189804 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e39063f8-caec-45bd-ae4f-e11765edec8b-cni-binary-copy\") pod \"multus-additional-cni-plugins-lp5l8\" (UID: \"e39063f8-caec-45bd-ae4f-e11765edec8b\") " pod="openshift-multus/multus-additional-cni-plugins-lp5l8" Apr 16 18:16:40.189933 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.189834 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e39063f8-caec-45bd-ae4f-e11765edec8b-cnibin\") pod \"multus-additional-cni-plugins-lp5l8\" (UID: \"e39063f8-caec-45bd-ae4f-e11765edec8b\") " pod="openshift-multus/multus-additional-cni-plugins-lp5l8" Apr 16 18:16:40.189933 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.189855 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4w6q8\" (UniqueName: \"kubernetes.io/projected/e39063f8-caec-45bd-ae4f-e11765edec8b-kube-api-access-4w6q8\") pod \"multus-additional-cni-plugins-lp5l8\" (UID: \"e39063f8-caec-45bd-ae4f-e11765edec8b\") " pod="openshift-multus/multus-additional-cni-plugins-lp5l8" Apr 16 18:16:40.189933 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.189899 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"cnibin\" (UniqueName: \"kubernetes.io/host-path/e39063f8-caec-45bd-ae4f-e11765edec8b-cnibin\") pod \"multus-additional-cni-plugins-lp5l8\" (UID: \"e39063f8-caec-45bd-ae4f-e11765edec8b\") " pod="openshift-multus/multus-additional-cni-plugins-lp5l8" Apr 16 18:16:40.189933 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.189931 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e39063f8-caec-45bd-ae4f-e11765edec8b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lp5l8\" (UID: \"e39063f8-caec-45bd-ae4f-e11765edec8b\") " pod="openshift-multus/multus-additional-cni-plugins-lp5l8" Apr 16 18:16:40.190144 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.189951 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e39063f8-caec-45bd-ae4f-e11765edec8b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lp5l8\" (UID: \"e39063f8-caec-45bd-ae4f-e11765edec8b\") " pod="openshift-multus/multus-additional-cni-plugins-lp5l8" Apr 16 18:16:40.190144 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.189969 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e39063f8-caec-45bd-ae4f-e11765edec8b-system-cni-dir\") pod \"multus-additional-cni-plugins-lp5l8\" (UID: \"e39063f8-caec-45bd-ae4f-e11765edec8b\") " pod="openshift-multus/multus-additional-cni-plugins-lp5l8" Apr 16 18:16:40.190144 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.190005 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e39063f8-caec-45bd-ae4f-e11765edec8b-system-cni-dir\") pod \"multus-additional-cni-plugins-lp5l8\" (UID: \"e39063f8-caec-45bd-ae4f-e11765edec8b\") " 
pod="openshift-multus/multus-additional-cni-plugins-lp5l8" Apr 16 18:16:40.190144 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.190088 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e39063f8-caec-45bd-ae4f-e11765edec8b-os-release\") pod \"multus-additional-cni-plugins-lp5l8\" (UID: \"e39063f8-caec-45bd-ae4f-e11765edec8b\") " pod="openshift-multus/multus-additional-cni-plugins-lp5l8" Apr 16 18:16:40.190144 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.190110 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e39063f8-caec-45bd-ae4f-e11765edec8b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lp5l8\" (UID: \"e39063f8-caec-45bd-ae4f-e11765edec8b\") " pod="openshift-multus/multus-additional-cni-plugins-lp5l8" Apr 16 18:16:40.190310 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.190196 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e39063f8-caec-45bd-ae4f-e11765edec8b-os-release\") pod \"multus-additional-cni-plugins-lp5l8\" (UID: \"e39063f8-caec-45bd-ae4f-e11765edec8b\") " pod="openshift-multus/multus-additional-cni-plugins-lp5l8" Apr 16 18:16:40.190379 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.190363 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e39063f8-caec-45bd-ae4f-e11765edec8b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lp5l8\" (UID: \"e39063f8-caec-45bd-ae4f-e11765edec8b\") " pod="openshift-multus/multus-additional-cni-plugins-lp5l8" Apr 16 18:16:40.190413 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.190397 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/e39063f8-caec-45bd-ae4f-e11765edec8b-cni-binary-copy\") pod \"multus-additional-cni-plugins-lp5l8\" (UID: \"e39063f8-caec-45bd-ae4f-e11765edec8b\") " pod="openshift-multus/multus-additional-cni-plugins-lp5l8" Apr 16 18:16:40.190449 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.190434 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e39063f8-caec-45bd-ae4f-e11765edec8b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lp5l8\" (UID: \"e39063f8-caec-45bd-ae4f-e11765edec8b\") " pod="openshift-multus/multus-additional-cni-plugins-lp5l8" Apr 16 18:16:40.190788 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.190771 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e39063f8-caec-45bd-ae4f-e11765edec8b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lp5l8\" (UID: \"e39063f8-caec-45bd-ae4f-e11765edec8b\") " pod="openshift-multus/multus-additional-cni-plugins-lp5l8" Apr 16 18:16:40.203460 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.203405 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w6q8\" (UniqueName: \"kubernetes.io/projected/e39063f8-caec-45bd-ae4f-e11765edec8b-kube-api-access-4w6q8\") pod \"multus-additional-cni-plugins-lp5l8\" (UID: \"e39063f8-caec-45bd-ae4f-e11765edec8b\") " pod="openshift-multus/multus-additional-cni-plugins-lp5l8" Apr 16 18:16:40.293779 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.293754 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-kq8d4" Apr 16 18:16:40.298253 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.298228 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6bmgn" Apr 16 18:16:40.299736 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:40.299713 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4168765_8ac0_4395_a56d_b2991fa122e3.slice/crio-ce8e3ed10f8048153391e903ebd6f7367fd887fdc35dd817e8e976e8f5a2b37b WatchSource:0}: Error finding container ce8e3ed10f8048153391e903ebd6f7367fd887fdc35dd817e8e976e8f5a2b37b: Status 404 returned error can't find the container with id ce8e3ed10f8048153391e903ebd6f7367fd887fdc35dd817e8e976e8f5a2b37b Apr 16 18:16:40.304059 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:40.304038 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7893c1b_506b_4195_91f7_dc9927cb3a36.slice/crio-4c11e83411b818a0ce8ab964da3b11bc77c87a49a6c11a672c3bc1c9d8486bae WatchSource:0}: Error finding container 4c11e83411b818a0ce8ab964da3b11bc77c87a49a6c11a672c3bc1c9d8486bae: Status 404 returned error can't find the container with id 4c11e83411b818a0ce8ab964da3b11bc77c87a49a6c11a672c3bc1c9d8486bae Apr 16 18:16:40.317573 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.317537 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-d89lv" Apr 16 18:16:40.323701 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:40.323681 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5146021c_a86d_4b5f_a47d_7f8c736f756e.slice/crio-bc53752b2ec7ebc21acab9045cee4d96acd8688edde699c042b9f7d8daecede8 WatchSource:0}: Error finding container bc53752b2ec7ebc21acab9045cee4d96acd8688edde699c042b9f7d8daecede8: Status 404 returned error can't find the container with id bc53752b2ec7ebc21acab9045cee4d96acd8688edde699c042b9f7d8daecede8 Apr 16 18:16:40.348105 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.348074 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-t6c2c" Apr 16 18:16:40.351750 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.351725 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv" Apr 16 18:16:40.354287 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:40.354263 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24c29d62_5808_4ee7_92cd_ce4e68faf741.slice/crio-0bc7f725a95b45f0c11ecd1a063ddbd0d7b2312a8951b627a7a7a0b6126082f4 WatchSource:0}: Error finding container 0bc7f725a95b45f0c11ecd1a063ddbd0d7b2312a8951b627a7a7a0b6126082f4: Status 404 returned error can't find the container with id 0bc7f725a95b45f0c11ecd1a063ddbd0d7b2312a8951b627a7a7a0b6126082f4 Apr 16 18:16:40.357838 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:40.357818 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e7d73ee_b24c_4ba6_94cc_4c6e3044a3f3.slice/crio-619668d9b94a9da5d8b1560cbd85bb4e11c10bfc7de603379d897012ce121935 WatchSource:0}: Error finding container 
619668d9b94a9da5d8b1560cbd85bb4e11c10bfc7de603379d897012ce121935: Status 404 returned error can't find the container with id 619668d9b94a9da5d8b1560cbd85bb4e11c10bfc7de603379d897012ce121935 Apr 16 18:16:40.358361 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.358348 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-c6ncp" Apr 16 18:16:40.364145 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.364125 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bsv9k" Apr 16 18:16:40.365338 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:40.365318 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99eb116c_fa90_4ac9_a593_ec208e5f2f43.slice/crio-865ca4d9db6fc325ac128b14e7050280ad0f21e978af0c86baea8c953050732e WatchSource:0}: Error finding container 865ca4d9db6fc325ac128b14e7050280ad0f21e978af0c86baea8c953050732e: Status 404 returned error can't find the container with id 865ca4d9db6fc325ac128b14e7050280ad0f21e978af0c86baea8c953050732e Apr 16 18:16:40.369640 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:40.369617 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23e0fe0d_a889_48ef_942d_1a02dac9ac5e.slice/crio-748967d28fd83cccb260972545162a209f6c7aa6e0d81632f3389bb6da4d4585 WatchSource:0}: Error finding container 748967d28fd83cccb260972545162a209f6c7aa6e0d81632f3389bb6da4d4585: Status 404 returned error can't find the container with id 748967d28fd83cccb260972545162a209f6c7aa6e0d81632f3389bb6da4d4585 Apr 16 18:16:40.373013 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.372996 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-ll55v" Apr 16 18:16:40.375996 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.375978 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-lp5l8" Apr 16 18:16:40.379861 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:40.379836 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18fc33d6_c4dd_487b_8457_811880ffd3ea.slice/crio-4b2e08ef4faa74bbcd5b280b7a390449dcc768e53ae2bb21db552c77f0703562 WatchSource:0}: Error finding container 4b2e08ef4faa74bbcd5b280b7a390449dcc768e53ae2bb21db552c77f0703562: Status 404 returned error can't find the container with id 4b2e08ef4faa74bbcd5b280b7a390449dcc768e53ae2bb21db552c77f0703562 Apr 16 18:16:40.385154 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:16:40.385132 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode39063f8_caec_45bd_ae4f_e11765edec8b.slice/crio-76d0c5242543c652d2a88e82d0f99fac3934599e680b82438152e42f632ef266 WatchSource:0}: Error finding container 76d0c5242543c652d2a88e82d0f99fac3934599e680b82438152e42f632ef266: Status 404 returned error can't find the container with id 76d0c5242543c652d2a88e82d0f99fac3934599e680b82438152e42f632ef266 Apr 16 18:16:40.594086 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.593999 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7930c14-4ef0-4949-a2ae-9a240da66c3c-metrics-certs\") pod \"network-metrics-daemon-sw2bl\" (UID: \"a7930c14-4ef0-4949-a2ae-9a240da66c3c\") " pod="openshift-multus/network-metrics-daemon-sw2bl" Apr 16 18:16:40.594240 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:40.594136 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:16:40.594240 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:40.594199 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7930c14-4ef0-4949-a2ae-9a240da66c3c-metrics-certs podName:a7930c14-4ef0-4949-a2ae-9a240da66c3c nodeName:}" failed. No retries permitted until 2026-04-16 18:16:41.59418068 +0000 UTC m=+3.101445779 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7930c14-4ef0-4949-a2ae-9a240da66c3c-metrics-certs") pod "network-metrics-daemon-sw2bl" (UID: "a7930c14-4ef0-4949-a2ae-9a240da66c3c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:16:40.694375 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:40.694328 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wgmln\" (UniqueName: \"kubernetes.io/projected/2571d812-7882-455b-be2f-4e3888df0e6a-kube-api-access-wgmln\") pod \"network-check-target-jgrxb\" (UID: \"2571d812-7882-455b-be2f-4e3888df0e6a\") " pod="openshift-network-diagnostics/network-check-target-jgrxb" Apr 16 18:16:40.694569 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:40.694525 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:16:40.694569 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:40.694561 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:16:40.694702 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:40.694575 2573 projected.go:194] Error preparing data for projected volume kube-api-access-wgmln for pod openshift-network-diagnostics/network-check-target-jgrxb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:16:40.694702 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:40.694632 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2571d812-7882-455b-be2f-4e3888df0e6a-kube-api-access-wgmln podName:2571d812-7882-455b-be2f-4e3888df0e6a nodeName:}" failed. No retries permitted until 2026-04-16 18:16:41.694614025 +0000 UTC m=+3.201879126 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-wgmln" (UniqueName: "kubernetes.io/projected/2571d812-7882-455b-be2f-4e3888df0e6a-kube-api-access-wgmln") pod "network-check-target-jgrxb" (UID: "2571d812-7882-455b-be2f-4e3888df0e6a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:16:41.020668 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:41.020120 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:11:40 +0000 UTC" deadline="2027-10-22 19:45:10.621582257 +0000 UTC" Apr 16 18:16:41.020668 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:41.020154 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13297h28m29.601431591s" Apr 16 18:16:41.129240 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:41.129203 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ll55v" event={"ID":"18fc33d6-c4dd-487b-8457-811880ffd3ea","Type":"ContainerStarted","Data":"4b2e08ef4faa74bbcd5b280b7a390449dcc768e53ae2bb21db552c77f0703562"} Apr 16 18:16:41.141085 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:41.141046 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bsv9k" 
event={"ID":"23e0fe0d-a889-48ef-942d-1a02dac9ac5e","Type":"ContainerStarted","Data":"748967d28fd83cccb260972545162a209f6c7aa6e0d81632f3389bb6da4d4585"} Apr 16 18:16:41.147816 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:41.147783 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-c6ncp" event={"ID":"99eb116c-fa90-4ac9-a593-ec208e5f2f43","Type":"ContainerStarted","Data":"865ca4d9db6fc325ac128b14e7050280ad0f21e978af0c86baea8c953050732e"} Apr 16 18:16:41.150160 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:41.150131 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv" event={"ID":"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3","Type":"ContainerStarted","Data":"619668d9b94a9da5d8b1560cbd85bb4e11c10bfc7de603379d897012ce121935"} Apr 16 18:16:41.154145 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:41.154115 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-t6c2c" event={"ID":"24c29d62-5808-4ee7-92cd-ce4e68faf741","Type":"ContainerStarted","Data":"0bc7f725a95b45f0c11ecd1a063ddbd0d7b2312a8951b627a7a7a0b6126082f4"} Apr 16 18:16:41.169690 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:41.169370 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lp5l8" event={"ID":"e39063f8-caec-45bd-ae4f-e11765edec8b","Type":"ContainerStarted","Data":"76d0c5242543c652d2a88e82d0f99fac3934599e680b82438152e42f632ef266"} Apr 16 18:16:41.170606 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:41.170584 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:16:41.190707 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:41.190643 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-d89lv" 
event={"ID":"5146021c-a86d-4b5f-a47d-7f8c736f756e","Type":"ContainerStarted","Data":"bc53752b2ec7ebc21acab9045cee4d96acd8688edde699c042b9f7d8daecede8"} Apr 16 18:16:41.193767 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:41.193726 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6bmgn" event={"ID":"c7893c1b-506b-4195-91f7-dc9927cb3a36","Type":"ContainerStarted","Data":"4c11e83411b818a0ce8ab964da3b11bc77c87a49a6c11a672c3bc1c9d8486bae"} Apr 16 18:16:41.201213 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:41.201161 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kq8d4" event={"ID":"c4168765-8ac0-4395-a56d-b2991fa122e3","Type":"ContainerStarted","Data":"ce8e3ed10f8048153391e903ebd6f7367fd887fdc35dd817e8e976e8f5a2b37b"} Apr 16 18:16:41.320202 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:41.320125 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:16:41.602090 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:41.602012 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7930c14-4ef0-4949-a2ae-9a240da66c3c-metrics-certs\") pod \"network-metrics-daemon-sw2bl\" (UID: \"a7930c14-4ef0-4949-a2ae-9a240da66c3c\") " pod="openshift-multus/network-metrics-daemon-sw2bl" Apr 16 18:16:41.602310 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:41.602288 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:16:41.602373 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:41.602364 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7930c14-4ef0-4949-a2ae-9a240da66c3c-metrics-certs podName:a7930c14-4ef0-4949-a2ae-9a240da66c3c nodeName:}" failed. 
No retries permitted until 2026-04-16 18:16:43.602344136 +0000 UTC m=+5.109609256 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7930c14-4ef0-4949-a2ae-9a240da66c3c-metrics-certs") pod "network-metrics-daemon-sw2bl" (UID: "a7930c14-4ef0-4949-a2ae-9a240da66c3c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:16:41.703020 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:41.702984 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wgmln\" (UniqueName: \"kubernetes.io/projected/2571d812-7882-455b-be2f-4e3888df0e6a-kube-api-access-wgmln\") pod \"network-check-target-jgrxb\" (UID: \"2571d812-7882-455b-be2f-4e3888df0e6a\") " pod="openshift-network-diagnostics/network-check-target-jgrxb" Apr 16 18:16:41.703208 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:41.703121 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:16:41.703208 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:41.703141 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:16:41.703208 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:41.703152 2573 projected.go:194] Error preparing data for projected volume kube-api-access-wgmln for pod openshift-network-diagnostics/network-check-target-jgrxb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:16:41.703356 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:41.703211 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2571d812-7882-455b-be2f-4e3888df0e6a-kube-api-access-wgmln 
podName:2571d812-7882-455b-be2f-4e3888df0e6a nodeName:}" failed. No retries permitted until 2026-04-16 18:16:43.703190067 +0000 UTC m=+5.210455164 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-wgmln" (UniqueName: "kubernetes.io/projected/2571d812-7882-455b-be2f-4e3888df0e6a-kube-api-access-wgmln") pod "network-check-target-jgrxb" (UID: "2571d812-7882-455b-be2f-4e3888df0e6a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:16:42.020893 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:42.020789 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:11:40 +0000 UTC" deadline="2028-01-09 04:08:21.479987338 +0000 UTC" Apr 16 18:16:42.020893 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:42.020846 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15177h51m39.459145129s" Apr 16 18:16:42.117132 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:42.117103 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sw2bl" Apr 16 18:16:42.117323 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:42.117235 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sw2bl" podUID="a7930c14-4ef0-4949-a2ae-9a240da66c3c" Apr 16 18:16:42.117385 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:42.117344 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jgrxb" Apr 16 18:16:42.117439 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:42.117415 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jgrxb" podUID="2571d812-7882-455b-be2f-4e3888df0e6a" Apr 16 18:16:42.133244 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:42.131769 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:16:43.617802 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:43.617755 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7930c14-4ef0-4949-a2ae-9a240da66c3c-metrics-certs\") pod \"network-metrics-daemon-sw2bl\" (UID: \"a7930c14-4ef0-4949-a2ae-9a240da66c3c\") " pod="openshift-multus/network-metrics-daemon-sw2bl" Apr 16 18:16:43.618272 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:43.617959 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:16:43.618272 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:43.618028 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7930c14-4ef0-4949-a2ae-9a240da66c3c-metrics-certs podName:a7930c14-4ef0-4949-a2ae-9a240da66c3c nodeName:}" failed. No retries permitted until 2026-04-16 18:16:47.618007149 +0000 UTC m=+9.125272266 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7930c14-4ef0-4949-a2ae-9a240da66c3c-metrics-certs") pod "network-metrics-daemon-sw2bl" (UID: "a7930c14-4ef0-4949-a2ae-9a240da66c3c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:16:43.718884 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:43.718845 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wgmln\" (UniqueName: \"kubernetes.io/projected/2571d812-7882-455b-be2f-4e3888df0e6a-kube-api-access-wgmln\") pod \"network-check-target-jgrxb\" (UID: \"2571d812-7882-455b-be2f-4e3888df0e6a\") " pod="openshift-network-diagnostics/network-check-target-jgrxb" Apr 16 18:16:43.719073 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:43.719007 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:16:43.719073 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:43.719026 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:16:43.719073 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:43.719040 2573 projected.go:194] Error preparing data for projected volume kube-api-access-wgmln for pod openshift-network-diagnostics/network-check-target-jgrxb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:16:43.719230 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:43.719103 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2571d812-7882-455b-be2f-4e3888df0e6a-kube-api-access-wgmln podName:2571d812-7882-455b-be2f-4e3888df0e6a nodeName:}" failed. 
No retries permitted until 2026-04-16 18:16:47.719085946 +0000 UTC m=+9.226351058 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-wgmln" (UniqueName: "kubernetes.io/projected/2571d812-7882-455b-be2f-4e3888df0e6a-kube-api-access-wgmln") pod "network-check-target-jgrxb" (UID: "2571d812-7882-455b-be2f-4e3888df0e6a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:16:44.116775 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:44.116658 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jgrxb" Apr 16 18:16:44.116918 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:44.116789 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jgrxb" podUID="2571d812-7882-455b-be2f-4e3888df0e6a" Apr 16 18:16:44.117193 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:44.117173 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sw2bl" Apr 16 18:16:44.117857 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:44.117759 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sw2bl" podUID="a7930c14-4ef0-4949-a2ae-9a240da66c3c" Apr 16 18:16:46.117604 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:46.117573 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jgrxb" Apr 16 18:16:46.118048 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:46.117696 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jgrxb" podUID="2571d812-7882-455b-be2f-4e3888df0e6a" Apr 16 18:16:46.118048 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:46.117573 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sw2bl" Apr 16 18:16:46.118048 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:46.117863 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sw2bl" podUID="a7930c14-4ef0-4949-a2ae-9a240da66c3c" Apr 16 18:16:47.649236 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:47.649197 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7930c14-4ef0-4949-a2ae-9a240da66c3c-metrics-certs\") pod \"network-metrics-daemon-sw2bl\" (UID: \"a7930c14-4ef0-4949-a2ae-9a240da66c3c\") " pod="openshift-multus/network-metrics-daemon-sw2bl" Apr 16 18:16:47.649711 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:47.649351 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:16:47.649711 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:47.649417 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7930c14-4ef0-4949-a2ae-9a240da66c3c-metrics-certs podName:a7930c14-4ef0-4949-a2ae-9a240da66c3c nodeName:}" failed. No retries permitted until 2026-04-16 18:16:55.649397986 +0000 UTC m=+17.156663083 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7930c14-4ef0-4949-a2ae-9a240da66c3c-metrics-certs") pod "network-metrics-daemon-sw2bl" (UID: "a7930c14-4ef0-4949-a2ae-9a240da66c3c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:16:47.750458 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:47.750161 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wgmln\" (UniqueName: \"kubernetes.io/projected/2571d812-7882-455b-be2f-4e3888df0e6a-kube-api-access-wgmln\") pod \"network-check-target-jgrxb\" (UID: \"2571d812-7882-455b-be2f-4e3888df0e6a\") " pod="openshift-network-diagnostics/network-check-target-jgrxb" Apr 16 18:16:47.750458 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:47.750303 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:16:47.750458 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:47.750325 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:16:47.750458 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:47.750338 2573 projected.go:194] Error preparing data for projected volume kube-api-access-wgmln for pod openshift-network-diagnostics/network-check-target-jgrxb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:16:47.750458 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:47.750402 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2571d812-7882-455b-be2f-4e3888df0e6a-kube-api-access-wgmln podName:2571d812-7882-455b-be2f-4e3888df0e6a nodeName:}" failed. 
No retries permitted until 2026-04-16 18:16:55.750382359 +0000 UTC m=+17.257647473 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-wgmln" (UniqueName: "kubernetes.io/projected/2571d812-7882-455b-be2f-4e3888df0e6a-kube-api-access-wgmln") pod "network-check-target-jgrxb" (UID: "2571d812-7882-455b-be2f-4e3888df0e6a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:16:48.117695 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:48.117607 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sw2bl" Apr 16 18:16:48.117849 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:48.117607 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jgrxb" Apr 16 18:16:48.117849 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:48.117746 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sw2bl" podUID="a7930c14-4ef0-4949-a2ae-9a240da66c3c" Apr 16 18:16:48.117849 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:48.117784 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jgrxb" podUID="2571d812-7882-455b-be2f-4e3888df0e6a" Apr 16 18:16:50.117369 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:50.117331 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jgrxb" Apr 16 18:16:50.117870 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:50.117334 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sw2bl" Apr 16 18:16:50.117870 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:50.117470 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jgrxb" podUID="2571d812-7882-455b-be2f-4e3888df0e6a" Apr 16 18:16:50.117870 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:50.117580 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sw2bl" podUID="a7930c14-4ef0-4949-a2ae-9a240da66c3c" Apr 16 18:16:52.117208 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:52.117168 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jgrxb" Apr 16 18:16:52.117706 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:52.117294 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jgrxb" podUID="2571d812-7882-455b-be2f-4e3888df0e6a" Apr 16 18:16:52.117706 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:52.117396 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sw2bl" Apr 16 18:16:52.117706 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:52.117520 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sw2bl" podUID="a7930c14-4ef0-4949-a2ae-9a240da66c3c" Apr 16 18:16:54.117303 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:54.117260 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jgrxb" Apr 16 18:16:54.117785 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:54.117262 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sw2bl" Apr 16 18:16:54.117785 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:54.117390 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jgrxb" podUID="2571d812-7882-455b-be2f-4e3888df0e6a" Apr 16 18:16:54.117785 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:54.117472 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sw2bl" podUID="a7930c14-4ef0-4949-a2ae-9a240da66c3c" Apr 16 18:16:55.708066 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:55.708028 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7930c14-4ef0-4949-a2ae-9a240da66c3c-metrics-certs\") pod \"network-metrics-daemon-sw2bl\" (UID: \"a7930c14-4ef0-4949-a2ae-9a240da66c3c\") " pod="openshift-multus/network-metrics-daemon-sw2bl" Apr 16 18:16:55.708568 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:55.708209 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:16:55.708568 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:55.708281 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7930c14-4ef0-4949-a2ae-9a240da66c3c-metrics-certs podName:a7930c14-4ef0-4949-a2ae-9a240da66c3c nodeName:}" failed. 
No retries permitted until 2026-04-16 18:17:11.708264401 +0000 UTC m=+33.215529496 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7930c14-4ef0-4949-a2ae-9a240da66c3c-metrics-certs") pod "network-metrics-daemon-sw2bl" (UID: "a7930c14-4ef0-4949-a2ae-9a240da66c3c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:16:55.809107 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:55.809071 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wgmln\" (UniqueName: \"kubernetes.io/projected/2571d812-7882-455b-be2f-4e3888df0e6a-kube-api-access-wgmln\") pod \"network-check-target-jgrxb\" (UID: \"2571d812-7882-455b-be2f-4e3888df0e6a\") " pod="openshift-network-diagnostics/network-check-target-jgrxb" Apr 16 18:16:55.809297 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:55.809216 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:16:55.809297 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:55.809236 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:16:55.809297 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:55.809256 2573 projected.go:194] Error preparing data for projected volume kube-api-access-wgmln for pod openshift-network-diagnostics/network-check-target-jgrxb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:16:55.809438 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:55.809314 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2571d812-7882-455b-be2f-4e3888df0e6a-kube-api-access-wgmln 
podName:2571d812-7882-455b-be2f-4e3888df0e6a nodeName:}" failed. No retries permitted until 2026-04-16 18:17:11.809295736 +0000 UTC m=+33.316560843 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-wgmln" (UniqueName: "kubernetes.io/projected/2571d812-7882-455b-be2f-4e3888df0e6a-kube-api-access-wgmln") pod "network-check-target-jgrxb" (UID: "2571d812-7882-455b-be2f-4e3888df0e6a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:16:56.116901 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:56.116826 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sw2bl" Apr 16 18:16:56.117060 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:56.116826 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jgrxb" Apr 16 18:16:56.117060 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:56.116953 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sw2bl" podUID="a7930c14-4ef0-4949-a2ae-9a240da66c3c" Apr 16 18:16:56.117060 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:56.117045 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jgrxb" podUID="2571d812-7882-455b-be2f-4e3888df0e6a" Apr 16 18:16:58.117392 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:58.117314 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sw2bl" Apr 16 18:16:58.117832 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:58.117423 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sw2bl" podUID="a7930c14-4ef0-4949-a2ae-9a240da66c3c" Apr 16 18:16:58.117832 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:58.117510 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jgrxb" Apr 16 18:16:58.117832 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:16:58.117633 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jgrxb" podUID="2571d812-7882-455b-be2f-4e3888df0e6a" Apr 16 18:16:58.235018 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:58.234786 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bsv9k" event={"ID":"23e0fe0d-a889-48ef-942d-1a02dac9ac5e","Type":"ContainerStarted","Data":"378aeb30630565f7778d5edd1ad094c42282eef916c718e0947ae7c98993b1bf"} Apr 16 18:16:58.236251 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:58.236229 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-203.ec2.internal" event={"ID":"cad5fa07f782ca2be3a004d69f182f5f","Type":"ContainerStarted","Data":"83b63964826e5a2cb009b24a13fd5dd78032023e307a964a94e3c0eda324db02"} Apr 16 18:16:58.237788 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:58.237764 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kq8d4" event={"ID":"c4168765-8ac0-4395-a56d-b2991fa122e3","Type":"ContainerStarted","Data":"8376017a6c962e63e8cc3daf68a2e9d2e19dab6e205af818eb65ae5797dc56be"} Apr 16 18:16:58.252850 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:58.252812 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-bsv9k" podStartSLOduration=1.826657371 podStartE2EDuration="19.252798998s" podCreationTimestamp="2026-04-16 18:16:39 +0000 UTC" firstStartedPulling="2026-04-16 18:16:40.370977903 +0000 UTC m=+1.878242997" lastFinishedPulling="2026-04-16 18:16:57.797119526 +0000 UTC m=+19.304384624" observedRunningTime="2026-04-16 18:16:58.252421088 +0000 UTC m=+19.759686203" watchObservedRunningTime="2026-04-16 18:16:58.252798998 +0000 UTC m=+19.760064113" Apr 16 18:16:58.267906 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:58.267851 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-131-203.ec2.internal" podStartSLOduration=19.26783394 podStartE2EDuration="19.26783394s" podCreationTimestamp="2026-04-16 18:16:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:16:58.267782432 +0000 UTC m=+19.775047548" watchObservedRunningTime="2026-04-16 18:16:58.26783394 +0000 UTC m=+19.775099055" Apr 16 18:16:58.285831 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:58.285791 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-kq8d4" podStartSLOduration=1.724341216 podStartE2EDuration="19.285774277s" podCreationTimestamp="2026-04-16 18:16:39 +0000 UTC" firstStartedPulling="2026-04-16 18:16:40.301363267 +0000 UTC m=+1.808628362" lastFinishedPulling="2026-04-16 18:16:57.862796325 +0000 UTC m=+19.370061423" observedRunningTime="2026-04-16 18:16:58.285341808 +0000 UTC m=+19.792606925" watchObservedRunningTime="2026-04-16 18:16:58.285774277 +0000 UTC m=+19.793039392" Apr 16 18:16:59.240772 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:59.240733 2573 generic.go:358] "Generic (PLEG): container finished" podID="0e77e2e37a581a4f0ed7090fbdb2ba3d" containerID="f4288ff640d0a63ef19a24121f8a52797b401b369ed6eaf952fea8b113b010c6" exitCode=0 Apr 16 18:16:59.241578 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:59.240818 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-203.ec2.internal" event={"ID":"0e77e2e37a581a4f0ed7090fbdb2ba3d","Type":"ContainerDied","Data":"f4288ff640d0a63ef19a24121f8a52797b401b369ed6eaf952fea8b113b010c6"} Apr 16 18:16:59.242059 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:59.242039 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ll55v" 
event={"ID":"18fc33d6-c4dd-487b-8457-811880ffd3ea","Type":"ContainerStarted","Data":"eb75450addb6e38bc0b2f404f40ad898c0b4183aae14ecc3b27bafc5289bf1f1"}
Apr 16 18:16:59.243282 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:59.243261 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-c6ncp" event={"ID":"99eb116c-fa90-4ac9-a593-ec208e5f2f43","Type":"ContainerStarted","Data":"b1bd195c7c3d7bccf89569cb4c502eb18a4932d5905ca5e5db46e59777a301c7"}
Apr 16 18:16:59.245861 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:59.245841 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv" event={"ID":"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3","Type":"ContainerStarted","Data":"ba52c700cc1beb376f788a373797693fe3b240278e6082286e645113d68b8177"}
Apr 16 18:16:59.245960 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:59.245868 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv" event={"ID":"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3","Type":"ContainerStarted","Data":"bc74955014318c79726e1816deff374df8ee78efdbb56e3431ce37b46b791061"}
Apr 16 18:16:59.245960 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:59.245881 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv" event={"ID":"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3","Type":"ContainerStarted","Data":"0d3b79030e56a448c31f7697866734dd753f4535d5c2ddb306eb69c610489695"}
Apr 16 18:16:59.245960 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:59.245894 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv" event={"ID":"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3","Type":"ContainerStarted","Data":"086cbc4b65efd5d3c8fc1914a9f0f959fcf6870845a742c72fb7ca6aa4faff2c"}
Apr 16 18:16:59.245960 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:59.245906 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv" event={"ID":"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3","Type":"ContainerStarted","Data":"b1921d2cd41eb719e439bd4e6e9f07d5a95be1832a7d636edd627dcad81becee"}
Apr 16 18:16:59.245960 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:59.245917 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv" event={"ID":"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3","Type":"ContainerStarted","Data":"b53a606f16e152d3caf3bc6cc6c3dcc8890f6cc1080e8e6fdeb89dff37932ec1"}
Apr 16 18:16:59.247265 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:59.247235 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-t6c2c" event={"ID":"24c29d62-5808-4ee7-92cd-ce4e68faf741","Type":"ContainerStarted","Data":"43239ec2739cc2e1dd9a5c6fb53d3c895ccf6f498a40fa3a1fee001c7f2019b7"}
Apr 16 18:16:59.249048 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:59.249002 2573 generic.go:358] "Generic (PLEG): container finished" podID="e39063f8-caec-45bd-ae4f-e11765edec8b" containerID="b6beb9666fdfe8834eb35f932603d3016cb1150ad6a3141f57fbfc2b07a9a602" exitCode=0
Apr 16 18:16:59.249165 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:59.249126 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lp5l8" event={"ID":"e39063f8-caec-45bd-ae4f-e11765edec8b","Type":"ContainerDied","Data":"b6beb9666fdfe8834eb35f932603d3016cb1150ad6a3141f57fbfc2b07a9a602"}
Apr 16 18:16:59.251384 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:59.251360 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-d89lv" event={"ID":"5146021c-a86d-4b5f-a47d-7f8c736f756e","Type":"ContainerStarted","Data":"3d9eef09c533b481d974d5aa34ead2dbf4372c922633315bc40b0ac5f898003a"}
Apr 16 18:16:59.255904 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:59.255881 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6bmgn" event={"ID":"c7893c1b-506b-4195-91f7-dc9927cb3a36","Type":"ContainerStarted","Data":"f820dbecc616a2b24bec9a7bcfa9c87ba4ff94fd48694f3e6f0ccae0f18ff7be"}
Apr 16 18:16:59.292073 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:59.292016 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-t6c2c" podStartSLOduration=2.856004753 podStartE2EDuration="20.291996941s" podCreationTimestamp="2026-04-16 18:16:39 +0000 UTC" firstStartedPulling="2026-04-16 18:16:40.355712217 +0000 UTC m=+1.862977312" lastFinishedPulling="2026-04-16 18:16:57.791704406 +0000 UTC m=+19.298969500" observedRunningTime="2026-04-16 18:16:59.291705414 +0000 UTC m=+20.798970531" watchObservedRunningTime="2026-04-16 18:16:59.291996941 +0000 UTC m=+20.799262058"
Apr 16 18:16:59.306489 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:59.306434 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-c6ncp" podStartSLOduration=7.453159121 podStartE2EDuration="20.306416061s" podCreationTimestamp="2026-04-16 18:16:39 +0000 UTC" firstStartedPulling="2026-04-16 18:16:40.366908649 +0000 UTC m=+1.874173746" lastFinishedPulling="2026-04-16 18:16:53.220165579 +0000 UTC m=+14.727430686" observedRunningTime="2026-04-16 18:16:59.305869764 +0000 UTC m=+20.813134902" watchObservedRunningTime="2026-04-16 18:16:59.306416061 +0000 UTC m=+20.813681177"
Apr 16 18:16:59.325473 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:59.325424 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-ll55v" podStartSLOduration=2.917110888 podStartE2EDuration="20.325404234s" podCreationTimestamp="2026-04-16 18:16:39 +0000 UTC" firstStartedPulling="2026-04-16 18:16:40.3819613 +0000 UTC m=+1.889226412" lastFinishedPulling="2026-04-16 18:16:57.79025465 +0000 UTC m=+19.297519758"
observedRunningTime="2026-04-16 18:16:59.325106028 +0000 UTC m=+20.832371145" watchObservedRunningTime="2026-04-16 18:16:59.325404234 +0000 UTC m=+20.832669350"
Apr 16 18:16:59.340594 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:59.340529 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-d89lv" podStartSLOduration=2.889531977 podStartE2EDuration="20.34051572s" podCreationTimestamp="2026-04-16 18:16:39 +0000 UTC" firstStartedPulling="2026-04-16 18:16:40.324944566 +0000 UTC m=+1.832209661" lastFinishedPulling="2026-04-16 18:16:57.775928305 +0000 UTC m=+19.283193404" observedRunningTime="2026-04-16 18:16:59.340229381 +0000 UTC m=+20.847494509" watchObservedRunningTime="2026-04-16 18:16:59.34051572 +0000 UTC m=+20.847780834"
Apr 16 18:16:59.751307 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:16:59.751285 2573 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 18:17:00.059393 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:00.059206 2573 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T18:16:59.751303808Z","UUID":"973b17e8-922f-41c7-9a63-a1b3be9d7b59","Handler":null,"Name":"","Endpoint":""}
Apr 16 18:17:00.062434 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:00.062401 2573 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 18:17:00.062434 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:00.062434 2573 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 18:17:00.116650 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:00.116615 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jgrxb"
Apr 16 18:17:00.116806 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:00.116615 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sw2bl"
Apr 16 18:17:00.116806 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:00.116738 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jgrxb" podUID="2571d812-7882-455b-be2f-4e3888df0e6a"
Apr 16 18:17:00.116922 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:00.116860 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sw2bl" podUID="a7930c14-4ef0-4949-a2ae-9a240da66c3c"
Apr 16 18:17:00.260857 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:00.260821 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6bmgn" event={"ID":"c7893c1b-506b-4195-91f7-dc9927cb3a36","Type":"ContainerStarted","Data":"2b22f1f7e42532b96be0eb9d539748dc1f3834e0737c67666c73c9374fb172f7"}
Apr 16 18:17:00.263321 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:00.263248 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-203.ec2.internal" event={"ID":"0e77e2e37a581a4f0ed7090fbdb2ba3d","Type":"ContainerStarted","Data":"c1c964d5fc841c7e627b00d87fb3895e4a69f46201e40a60ee349a69de976bae"}
Apr 16 18:17:00.278436 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:00.278388 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-203.ec2.internal" podStartSLOduration=21.278372533 podStartE2EDuration="21.278372533s" podCreationTimestamp="2026-04-16 18:16:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:17:00.278030325 +0000 UTC m=+21.785295442" watchObservedRunningTime="2026-04-16 18:17:00.278372533 +0000 UTC m=+21.785637650"
Apr 16 18:17:01.268366 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:01.268331 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv" event={"ID":"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3","Type":"ContainerStarted","Data":"73cd688d491164f8491ce2394de18dcf44c6344e32e615a991c0ef37bdb2089c"}
Apr 16 18:17:01.270261 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:01.270230 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod"
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6bmgn" event={"ID":"c7893c1b-506b-4195-91f7-dc9927cb3a36","Type":"ContainerStarted","Data":"5327208de56608dad6949d08ccaf73440651d49b19a7258325a26265ec832da0"}
Apr 16 18:17:01.290168 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:01.290122 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6bmgn" podStartSLOduration=2.066418758 podStartE2EDuration="22.290107644s" podCreationTimestamp="2026-04-16 18:16:39 +0000 UTC" firstStartedPulling="2026-04-16 18:16:40.305697804 +0000 UTC m=+1.812962899" lastFinishedPulling="2026-04-16 18:17:00.529386674 +0000 UTC m=+22.036651785" observedRunningTime="2026-04-16 18:17:01.289974026 +0000 UTC m=+22.797239155" watchObservedRunningTime="2026-04-16 18:17:01.290107644 +0000 UTC m=+22.797372759"
Apr 16 18:17:02.117245 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:02.117197 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jgrxb"
Apr 16 18:17:02.117443 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:02.117340 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jgrxb" podUID="2571d812-7882-455b-be2f-4e3888df0e6a"
Apr 16 18:17:02.117443 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:02.117390 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sw2bl"
Apr 16 18:17:02.117570 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:02.117494 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sw2bl" podUID="a7930c14-4ef0-4949-a2ae-9a240da66c3c"
Apr 16 18:17:02.710894 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:02.710703 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-c6ncp"
Apr 16 18:17:02.711337 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:02.711271 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-c6ncp"
Apr 16 18:17:03.278087 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:03.278035 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv" event={"ID":"7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3","Type":"ContainerStarted","Data":"0239350910d362ed2b5c38271565f9a99e82bac756a50d09e957d80ee15c0636"}
Apr 16 18:17:03.279001 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:03.278552 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-c6ncp"
Apr 16 18:17:03.279001 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:03.278592 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:17:03.279323 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:03.279303 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-c6ncp"
Apr 16 18:17:03.292677 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:03.292655 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:17:03.305960 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:03.305909 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv" podStartSLOduration=6.457157505 podStartE2EDuration="24.305896047s" podCreationTimestamp="2026-04-16 18:16:39 +0000 UTC" firstStartedPulling="2026-04-16 18:16:40.360638941 +0000 UTC m=+1.867904050" lastFinishedPulling="2026-04-16 18:16:58.20937749 +0000 UTC m=+19.716642592" observedRunningTime="2026-04-16 18:17:03.304245614 +0000 UTC m=+24.811510729" watchObservedRunningTime="2026-04-16 18:17:03.305896047 +0000 UTC m=+24.813161198"
Apr 16 18:17:04.117608 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:04.117427 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jgrxb"
Apr 16 18:17:04.118350 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:04.117435 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sw2bl"
Apr 16 18:17:04.118350 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:04.117716 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-jgrxb" podUID="2571d812-7882-455b-be2f-4e3888df0e6a"
Apr 16 18:17:04.118350 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:04.117752 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sw2bl" podUID="a7930c14-4ef0-4949-a2ae-9a240da66c3c"
Apr 16 18:17:04.281517 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:04.281487 2573 generic.go:358] "Generic (PLEG): container finished" podID="e39063f8-caec-45bd-ae4f-e11765edec8b" containerID="867a5c577deb5d559181c23441fd266a1e2d14eea21ea3bca5346a81d0bc4bad" exitCode=0
Apr 16 18:17:04.281690 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:04.281577 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lp5l8" event={"ID":"e39063f8-caec-45bd-ae4f-e11765edec8b","Type":"ContainerDied","Data":"867a5c577deb5d559181c23441fd266a1e2d14eea21ea3bca5346a81d0bc4bad"}
Apr 16 18:17:04.281903 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:04.281887 2573 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 18:17:04.282584 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:04.282285 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:17:04.296699 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:04.296678 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:17:05.075726 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:05.075695 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-sw2bl"]
Apr 16 18:17:05.075882 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:05.075822 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sw2bl"
Apr 16 18:17:05.075925 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:05.075910 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sw2bl" podUID="a7930c14-4ef0-4949-a2ae-9a240da66c3c"
Apr 16 18:17:05.077962 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:05.077827 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jgrxb"]
Apr 16 18:17:05.077962 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:05.077945 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jgrxb"
Apr 16 18:17:05.078125 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:05.078023 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jgrxb" podUID="2571d812-7882-455b-be2f-4e3888df0e6a"
Apr 16 18:17:05.285111 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:05.285083 2573 generic.go:358] "Generic (PLEG): container finished" podID="e39063f8-caec-45bd-ae4f-e11765edec8b" containerID="ee635d4c9c07c062cbc41ed1cc4ed6b6ec6b217574defc203235fcac4750cd94" exitCode=0
Apr 16 18:17:05.285430 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:05.285159 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lp5l8" event={"ID":"e39063f8-caec-45bd-ae4f-e11765edec8b","Type":"ContainerDied","Data":"ee635d4c9c07c062cbc41ed1cc4ed6b6ec6b217574defc203235fcac4750cd94"}
Apr 16 18:17:05.285430 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:05.285402 2573 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 18:17:06.288951 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:06.288918 2573 generic.go:358] "Generic (PLEG): container finished" podID="e39063f8-caec-45bd-ae4f-e11765edec8b" containerID="a5baf74489e4a78d06374ef559957f52af876dc3ddc84cec1f043f877e99cf6f" exitCode=0
Apr 16 18:17:06.289320 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:06.288956 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lp5l8" event={"ID":"e39063f8-caec-45bd-ae4f-e11765edec8b","Type":"ContainerDied","Data":"a5baf74489e4a78d06374ef559957f52af876dc3ddc84cec1f043f877e99cf6f"}
Apr 16 18:17:06.289320 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:06.289207 2573 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 18:17:07.119637 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:07.119605 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jgrxb"
Apr 16 18:17:07.119782 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:07.119605 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sw2bl"
Apr 16 18:17:07.119782 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:07.119713 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jgrxb" podUID="2571d812-7882-455b-be2f-4e3888df0e6a"
Apr 16 18:17:07.119880 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:07.119802 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sw2bl" podUID="a7930c14-4ef0-4949-a2ae-9a240da66c3c"
Apr 16 18:17:08.570509 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:08.570476 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv"
Apr 16 18:17:08.570935 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:08.570724 2573 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 18:17:08.590169 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:08.590112 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv" podUID="7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3" containerName="ovnkube-controller" probeResult="failure" output=""
Apr 16 18:17:08.599719 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:08.599689 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv" podUID="7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3" containerName="ovnkube-controller" probeResult="failure" output=""
Apr 16 18:17:09.118065 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:09.118031 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jgrxb"
Apr 16 18:17:09.118215 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:09.118140 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sw2bl"
Apr 16 18:17:09.118215 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:09.118229 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sw2bl" podUID="a7930c14-4ef0-4949-a2ae-9a240da66c3c"
Apr 16 18:17:09.118498 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:09.118248 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jgrxb" podUID="2571d812-7882-455b-be2f-4e3888df0e6a"
Apr 16 18:17:10.779843 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:10.779817 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-203.ec2.internal" event="NodeReady"
Apr 16 18:17:10.780186 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:10.779969 2573 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 18:17:10.823082 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:10.823047 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-whfl9"]
Apr 16 18:17:10.873791 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:10.873750 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-fqp4t"]
Apr 16 18:17:10.873951 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:10.873916 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-dns/dns-default-whfl9"
Apr 16 18:17:10.876377 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:10.876347 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 18:17:10.876377 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:10.876367 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 18:17:10.876639 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:10.876424 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-gz4g7\""
Apr 16 18:17:10.888906 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:10.888883 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-whfl9"]
Apr 16 18:17:10.888906 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:10.888911 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fqp4t"]
Apr 16 18:17:10.889068 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:10.889018 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fqp4t"
Apr 16 18:17:10.891917 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:10.891784 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 18:17:10.891917 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:10.891820 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 18:17:10.891917 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:10.891847 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-4b92n\""
Apr 16 18:17:10.892252 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:10.892236 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 18:17:11.023696 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:11.023601 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8503a04-7aaa-49ef-bec9-fb099ecb0065-cert\") pod \"ingress-canary-fqp4t\" (UID: \"b8503a04-7aaa-49ef-bec9-fb099ecb0065\") " pod="openshift-ingress-canary/ingress-canary-fqp4t"
Apr 16 18:17:11.023696 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:11.023648 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8fwk\" (UniqueName: \"kubernetes.io/projected/b8503a04-7aaa-49ef-bec9-fb099ecb0065-kube-api-access-q8fwk\") pod \"ingress-canary-fqp4t\" (UID: \"b8503a04-7aaa-49ef-bec9-fb099ecb0065\") " pod="openshift-ingress-canary/ingress-canary-fqp4t"
Apr 16 18:17:11.023924 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:11.023791 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/267cfa25-31fb-4ef1-af56-1f468ac12dc6-metrics-tls\") pod \"dns-default-whfl9\" (UID: \"267cfa25-31fb-4ef1-af56-1f468ac12dc6\") " pod="openshift-dns/dns-default-whfl9"
Apr 16 18:17:11.023924 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:11.023816 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/267cfa25-31fb-4ef1-af56-1f468ac12dc6-tmp-dir\") pod \"dns-default-whfl9\" (UID: \"267cfa25-31fb-4ef1-af56-1f468ac12dc6\") " pod="openshift-dns/dns-default-whfl9"
Apr 16 18:17:11.023924 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:11.023842 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/267cfa25-31fb-4ef1-af56-1f468ac12dc6-config-volume\") pod \"dns-default-whfl9\" (UID: \"267cfa25-31fb-4ef1-af56-1f468ac12dc6\") " pod="openshift-dns/dns-default-whfl9"
Apr 16 18:17:11.023924 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:11.023906 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhnzw\" (UniqueName: \"kubernetes.io/projected/267cfa25-31fb-4ef1-af56-1f468ac12dc6-kube-api-access-bhnzw\") pod \"dns-default-whfl9\" (UID: \"267cfa25-31fb-4ef1-af56-1f468ac12dc6\") " pod="openshift-dns/dns-default-whfl9"
Apr 16 18:17:11.117050 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:11.117018 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jgrxb"
Apr 16 18:17:11.117050 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:11.117036 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sw2bl"
Apr 16 18:17:11.119488 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:11.119465 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 18:17:11.119631 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:11.119580 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-p8pr6\""
Apr 16 18:17:11.119631 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:11.119613 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 18:17:11.119732 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:11.119705 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-pwdr4\""
Apr 16 18:17:11.119859 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:11.119842 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 18:17:11.124214 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:11.124185 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/267cfa25-31fb-4ef1-af56-1f468ac12dc6-metrics-tls\") pod \"dns-default-whfl9\" (UID: \"267cfa25-31fb-4ef1-af56-1f468ac12dc6\") " pod="openshift-dns/dns-default-whfl9"
Apr 16 18:17:11.124214 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:11.124212 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/267cfa25-31fb-4ef1-af56-1f468ac12dc6-tmp-dir\") pod \"dns-default-whfl9\" (UID: \"267cfa25-31fb-4ef1-af56-1f468ac12dc6\") " pod="openshift-dns/dns-default-whfl9"
Apr 16 18:17:11.124389 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:11.124246 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/267cfa25-31fb-4ef1-af56-1f468ac12dc6-config-volume\") pod \"dns-default-whfl9\" (UID: \"267cfa25-31fb-4ef1-af56-1f468ac12dc6\") " pod="openshift-dns/dns-default-whfl9"
Apr 16 18:17:11.124389 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:11.124277 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bhnzw\" (UniqueName: \"kubernetes.io/projected/267cfa25-31fb-4ef1-af56-1f468ac12dc6-kube-api-access-bhnzw\") pod \"dns-default-whfl9\" (UID: \"267cfa25-31fb-4ef1-af56-1f468ac12dc6\") " pod="openshift-dns/dns-default-whfl9"
Apr 16 18:17:11.124389 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:11.124338 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8503a04-7aaa-49ef-bec9-fb099ecb0065-cert\") pod \"ingress-canary-fqp4t\" (UID: \"b8503a04-7aaa-49ef-bec9-fb099ecb0065\") " pod="openshift-ingress-canary/ingress-canary-fqp4t"
Apr 16 18:17:11.124389 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:11.124345 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:17:11.124389 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:11.124362 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8fwk\" (UniqueName: \"kubernetes.io/projected/b8503a04-7aaa-49ef-bec9-fb099ecb0065-kube-api-access-q8fwk\") pod \"ingress-canary-fqp4t\" (UID: \"b8503a04-7aaa-49ef-bec9-fb099ecb0065\") " pod="openshift-ingress-canary/ingress-canary-fqp4t"
Apr 16 18:17:11.124634 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:11.124431 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/267cfa25-31fb-4ef1-af56-1f468ac12dc6-metrics-tls podName:267cfa25-31fb-4ef1-af56-1f468ac12dc6 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:11.62441059 +0000 UTC m=+33.131675688 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/267cfa25-31fb-4ef1-af56-1f468ac12dc6-metrics-tls") pod "dns-default-whfl9" (UID: "267cfa25-31fb-4ef1-af56-1f468ac12dc6") : secret "dns-default-metrics-tls" not found
Apr 16 18:17:11.124634 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:11.124615 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:17:11.124733 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:11.124652 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/267cfa25-31fb-4ef1-af56-1f468ac12dc6-tmp-dir\") pod \"dns-default-whfl9\" (UID: \"267cfa25-31fb-4ef1-af56-1f468ac12dc6\") " pod="openshift-dns/dns-default-whfl9"
Apr 16 18:17:11.124733 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:11.124668 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8503a04-7aaa-49ef-bec9-fb099ecb0065-cert podName:b8503a04-7aaa-49ef-bec9-fb099ecb0065 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:11.624651585 +0000 UTC m=+33.131916680 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b8503a04-7aaa-49ef-bec9-fb099ecb0065-cert") pod "ingress-canary-fqp4t" (UID: "b8503a04-7aaa-49ef-bec9-fb099ecb0065") : secret "canary-serving-cert" not found Apr 16 18:17:11.133316 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:11.133297 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/267cfa25-31fb-4ef1-af56-1f468ac12dc6-config-volume\") pod \"dns-default-whfl9\" (UID: \"267cfa25-31fb-4ef1-af56-1f468ac12dc6\") " pod="openshift-dns/dns-default-whfl9" Apr 16 18:17:11.134882 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:11.134864 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhnzw\" (UniqueName: \"kubernetes.io/projected/267cfa25-31fb-4ef1-af56-1f468ac12dc6-kube-api-access-bhnzw\") pod \"dns-default-whfl9\" (UID: \"267cfa25-31fb-4ef1-af56-1f468ac12dc6\") " pod="openshift-dns/dns-default-whfl9" Apr 16 18:17:11.134967 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:11.134927 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8fwk\" (UniqueName: \"kubernetes.io/projected/b8503a04-7aaa-49ef-bec9-fb099ecb0065-kube-api-access-q8fwk\") pod \"ingress-canary-fqp4t\" (UID: \"b8503a04-7aaa-49ef-bec9-fb099ecb0065\") " pod="openshift-ingress-canary/ingress-canary-fqp4t" Apr 16 18:17:11.627252 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:11.627209 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8503a04-7aaa-49ef-bec9-fb099ecb0065-cert\") pod \"ingress-canary-fqp4t\" (UID: \"b8503a04-7aaa-49ef-bec9-fb099ecb0065\") " pod="openshift-ingress-canary/ingress-canary-fqp4t" Apr 16 18:17:11.627427 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:11.627282 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/267cfa25-31fb-4ef1-af56-1f468ac12dc6-metrics-tls\") pod \"dns-default-whfl9\" (UID: \"267cfa25-31fb-4ef1-af56-1f468ac12dc6\") " pod="openshift-dns/dns-default-whfl9" Apr 16 18:17:11.627427 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:11.627362 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:17:11.627427 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:11.627385 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:17:11.627529 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:11.627429 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8503a04-7aaa-49ef-bec9-fb099ecb0065-cert podName:b8503a04-7aaa-49ef-bec9-fb099ecb0065 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:12.627412342 +0000 UTC m=+34.134677442 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b8503a04-7aaa-49ef-bec9-fb099ecb0065-cert") pod "ingress-canary-fqp4t" (UID: "b8503a04-7aaa-49ef-bec9-fb099ecb0065") : secret "canary-serving-cert" not found Apr 16 18:17:11.627529 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:11.627446 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/267cfa25-31fb-4ef1-af56-1f468ac12dc6-metrics-tls podName:267cfa25-31fb-4ef1-af56-1f468ac12dc6 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:12.627439811 +0000 UTC m=+34.134704905 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/267cfa25-31fb-4ef1-af56-1f468ac12dc6-metrics-tls") pod "dns-default-whfl9" (UID: "267cfa25-31fb-4ef1-af56-1f468ac12dc6") : secret "dns-default-metrics-tls" not found Apr 16 18:17:11.728426 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:11.728386 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7930c14-4ef0-4949-a2ae-9a240da66c3c-metrics-certs\") pod \"network-metrics-daemon-sw2bl\" (UID: \"a7930c14-4ef0-4949-a2ae-9a240da66c3c\") " pod="openshift-multus/network-metrics-daemon-sw2bl" Apr 16 18:17:11.728607 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:11.728519 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 18:17:11.728607 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:11.728596 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7930c14-4ef0-4949-a2ae-9a240da66c3c-metrics-certs podName:a7930c14-4ef0-4949-a2ae-9a240da66c3c nodeName:}" failed. No retries permitted until 2026-04-16 18:17:43.728581443 +0000 UTC m=+65.235846537 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7930c14-4ef0-4949-a2ae-9a240da66c3c-metrics-certs") pod "network-metrics-daemon-sw2bl" (UID: "a7930c14-4ef0-4949-a2ae-9a240da66c3c") : secret "metrics-daemon-secret" not found Apr 16 18:17:11.829472 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:11.829440 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wgmln\" (UniqueName: \"kubernetes.io/projected/2571d812-7882-455b-be2f-4e3888df0e6a-kube-api-access-wgmln\") pod \"network-check-target-jgrxb\" (UID: \"2571d812-7882-455b-be2f-4e3888df0e6a\") " pod="openshift-network-diagnostics/network-check-target-jgrxb" Apr 16 18:17:11.831930 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:11.831907 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgmln\" (UniqueName: \"kubernetes.io/projected/2571d812-7882-455b-be2f-4e3888df0e6a-kube-api-access-wgmln\") pod \"network-check-target-jgrxb\" (UID: \"2571d812-7882-455b-be2f-4e3888df0e6a\") " pod="openshift-network-diagnostics/network-check-target-jgrxb" Apr 16 18:17:12.029410 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:12.029379 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jgrxb" Apr 16 18:17:12.213370 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:12.213093 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jgrxb"] Apr 16 18:17:12.216091 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:17:12.216060 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2571d812_7882_455b_be2f_4e3888df0e6a.slice/crio-ae5c4e77ea9623ce3b92c0710db0d8dec0d2825e3d11c8ce892bfa8afe629051 WatchSource:0}: Error finding container ae5c4e77ea9623ce3b92c0710db0d8dec0d2825e3d11c8ce892bfa8afe629051: Status 404 returned error can't find the container with id ae5c4e77ea9623ce3b92c0710db0d8dec0d2825e3d11c8ce892bfa8afe629051 Apr 16 18:17:12.304445 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:12.302630 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jgrxb" event={"ID":"2571d812-7882-455b-be2f-4e3888df0e6a","Type":"ContainerStarted","Data":"ae5c4e77ea9623ce3b92c0710db0d8dec0d2825e3d11c8ce892bfa8afe629051"} Apr 16 18:17:12.304827 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:12.304805 2573 generic.go:358] "Generic (PLEG): container finished" podID="e39063f8-caec-45bd-ae4f-e11765edec8b" containerID="70f8a93f77b8dc0f46c355189d8fb03f954618ace628310b20dcbf887ae73e49" exitCode=0 Apr 16 18:17:12.304901 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:12.304852 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lp5l8" event={"ID":"e39063f8-caec-45bd-ae4f-e11765edec8b","Type":"ContainerDied","Data":"70f8a93f77b8dc0f46c355189d8fb03f954618ace628310b20dcbf887ae73e49"} Apr 16 18:17:12.635978 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:12.635946 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/b8503a04-7aaa-49ef-bec9-fb099ecb0065-cert\") pod \"ingress-canary-fqp4t\" (UID: \"b8503a04-7aaa-49ef-bec9-fb099ecb0065\") " pod="openshift-ingress-canary/ingress-canary-fqp4t" Apr 16 18:17:12.636111 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:12.635994 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/267cfa25-31fb-4ef1-af56-1f468ac12dc6-metrics-tls\") pod \"dns-default-whfl9\" (UID: \"267cfa25-31fb-4ef1-af56-1f468ac12dc6\") " pod="openshift-dns/dns-default-whfl9" Apr 16 18:17:12.636111 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:12.636092 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:17:12.636182 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:12.636154 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8503a04-7aaa-49ef-bec9-fb099ecb0065-cert podName:b8503a04-7aaa-49ef-bec9-fb099ecb0065 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:14.636136142 +0000 UTC m=+36.143401239 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b8503a04-7aaa-49ef-bec9-fb099ecb0065-cert") pod "ingress-canary-fqp4t" (UID: "b8503a04-7aaa-49ef-bec9-fb099ecb0065") : secret "canary-serving-cert" not found Apr 16 18:17:12.636182 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:12.636096 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:17:12.636319 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:12.636215 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/267cfa25-31fb-4ef1-af56-1f468ac12dc6-metrics-tls podName:267cfa25-31fb-4ef1-af56-1f468ac12dc6 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:17:14.63620388 +0000 UTC m=+36.143468974 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/267cfa25-31fb-4ef1-af56-1f468ac12dc6-metrics-tls") pod "dns-default-whfl9" (UID: "267cfa25-31fb-4ef1-af56-1f468ac12dc6") : secret "dns-default-metrics-tls" not found Apr 16 18:17:13.310517 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:13.310482 2573 generic.go:358] "Generic (PLEG): container finished" podID="e39063f8-caec-45bd-ae4f-e11765edec8b" containerID="59887a7b20a1d5085825ca9017ad126638fad2d63b9b3fe86bd91630c659d687" exitCode=0 Apr 16 18:17:13.311028 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:13.310527 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lp5l8" event={"ID":"e39063f8-caec-45bd-ae4f-e11765edec8b","Type":"ContainerDied","Data":"59887a7b20a1d5085825ca9017ad126638fad2d63b9b3fe86bd91630c659d687"} Apr 16 18:17:14.316721 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:14.316682 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lp5l8" event={"ID":"e39063f8-caec-45bd-ae4f-e11765edec8b","Type":"ContainerStarted","Data":"7b6202c79c1f6544d877e5f36375b068db284804c41c15ede709d205673c5853"} Apr 16 18:17:14.340773 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:14.340728 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-lp5l8" podStartSLOduration=3.74458938 podStartE2EDuration="35.340711572s" podCreationTimestamp="2026-04-16 18:16:39 +0000 UTC" firstStartedPulling="2026-04-16 18:16:40.38745843 +0000 UTC m=+1.894723526" lastFinishedPulling="2026-04-16 18:17:11.983580624 +0000 UTC m=+33.490845718" observedRunningTime="2026-04-16 18:17:14.338687375 +0000 UTC m=+35.845952491" watchObservedRunningTime="2026-04-16 18:17:14.340711572 +0000 UTC m=+35.847976690" Apr 16 
18:17:14.650850 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:14.650801 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8503a04-7aaa-49ef-bec9-fb099ecb0065-cert\") pod \"ingress-canary-fqp4t\" (UID: \"b8503a04-7aaa-49ef-bec9-fb099ecb0065\") " pod="openshift-ingress-canary/ingress-canary-fqp4t" Apr 16 18:17:14.651036 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:14.650876 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/267cfa25-31fb-4ef1-af56-1f468ac12dc6-metrics-tls\") pod \"dns-default-whfl9\" (UID: \"267cfa25-31fb-4ef1-af56-1f468ac12dc6\") " pod="openshift-dns/dns-default-whfl9" Apr 16 18:17:14.651036 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:14.650970 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:17:14.651145 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:14.651051 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8503a04-7aaa-49ef-bec9-fb099ecb0065-cert podName:b8503a04-7aaa-49ef-bec9-fb099ecb0065 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:18.651030409 +0000 UTC m=+40.158295503 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b8503a04-7aaa-49ef-bec9-fb099ecb0065-cert") pod "ingress-canary-fqp4t" (UID: "b8503a04-7aaa-49ef-bec9-fb099ecb0065") : secret "canary-serving-cert" not found Apr 16 18:17:14.651145 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:14.650976 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:17:14.651145 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:14.651140 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/267cfa25-31fb-4ef1-af56-1f468ac12dc6-metrics-tls podName:267cfa25-31fb-4ef1-af56-1f468ac12dc6 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:18.651122355 +0000 UTC m=+40.158387471 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/267cfa25-31fb-4ef1-af56-1f468ac12dc6-metrics-tls") pod "dns-default-whfl9" (UID: "267cfa25-31fb-4ef1-af56-1f468ac12dc6") : secret "dns-default-metrics-tls" not found Apr 16 18:17:15.319633 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:15.319456 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jgrxb" event={"ID":"2571d812-7882-455b-be2f-4e3888df0e6a","Type":"ContainerStarted","Data":"931efe996b40c589a3e18745bbe445eb2fe9b15d501a110b2df005479e44b66d"} Apr 16 18:17:15.319964 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:15.319799 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-jgrxb" Apr 16 18:17:15.336185 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:15.336139 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-jgrxb" podStartSLOduration=33.357189924 podStartE2EDuration="36.336125463s" podCreationTimestamp="2026-04-16 18:16:39 
+0000 UTC" firstStartedPulling="2026-04-16 18:17:12.218023968 +0000 UTC m=+33.725289062" lastFinishedPulling="2026-04-16 18:17:15.196959497 +0000 UTC m=+36.704224601" observedRunningTime="2026-04-16 18:17:15.334921307 +0000 UTC m=+36.842186422" watchObservedRunningTime="2026-04-16 18:17:15.336125463 +0000 UTC m=+36.843390572" Apr 16 18:17:18.680190 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:18.680153 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8503a04-7aaa-49ef-bec9-fb099ecb0065-cert\") pod \"ingress-canary-fqp4t\" (UID: \"b8503a04-7aaa-49ef-bec9-fb099ecb0065\") " pod="openshift-ingress-canary/ingress-canary-fqp4t" Apr 16 18:17:18.680572 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:18.680221 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/267cfa25-31fb-4ef1-af56-1f468ac12dc6-metrics-tls\") pod \"dns-default-whfl9\" (UID: \"267cfa25-31fb-4ef1-af56-1f468ac12dc6\") " pod="openshift-dns/dns-default-whfl9" Apr 16 18:17:18.680572 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:18.680323 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:17:18.680572 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:18.680394 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/267cfa25-31fb-4ef1-af56-1f468ac12dc6-metrics-tls podName:267cfa25-31fb-4ef1-af56-1f468ac12dc6 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:26.680379593 +0000 UTC m=+48.187644689 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/267cfa25-31fb-4ef1-af56-1f468ac12dc6-metrics-tls") pod "dns-default-whfl9" (UID: "267cfa25-31fb-4ef1-af56-1f468ac12dc6") : secret "dns-default-metrics-tls" not found Apr 16 18:17:18.680572 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:18.680323 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:17:18.680572 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:18.680461 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8503a04-7aaa-49ef-bec9-fb099ecb0065-cert podName:b8503a04-7aaa-49ef-bec9-fb099ecb0065 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:26.680450601 +0000 UTC m=+48.187715709 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b8503a04-7aaa-49ef-bec9-fb099ecb0065-cert") pod "ingress-canary-fqp4t" (UID: "b8503a04-7aaa-49ef-bec9-fb099ecb0065") : secret "canary-serving-cert" not found Apr 16 18:17:26.728351 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:26.728310 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/267cfa25-31fb-4ef1-af56-1f468ac12dc6-metrics-tls\") pod \"dns-default-whfl9\" (UID: \"267cfa25-31fb-4ef1-af56-1f468ac12dc6\") " pod="openshift-dns/dns-default-whfl9" Apr 16 18:17:26.728806 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:26.728373 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8503a04-7aaa-49ef-bec9-fb099ecb0065-cert\") pod \"ingress-canary-fqp4t\" (UID: \"b8503a04-7aaa-49ef-bec9-fb099ecb0065\") " pod="openshift-ingress-canary/ingress-canary-fqp4t" Apr 16 18:17:26.728806 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:26.728458 2573 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:17:26.728806 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:26.728462 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:17:26.728806 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:26.728519 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8503a04-7aaa-49ef-bec9-fb099ecb0065-cert podName:b8503a04-7aaa-49ef-bec9-fb099ecb0065 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:42.728504 +0000 UTC m=+64.235769095 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b8503a04-7aaa-49ef-bec9-fb099ecb0065-cert") pod "ingress-canary-fqp4t" (UID: "b8503a04-7aaa-49ef-bec9-fb099ecb0065") : secret "canary-serving-cert" not found Apr 16 18:17:26.728806 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:26.728531 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/267cfa25-31fb-4ef1-af56-1f468ac12dc6-metrics-tls podName:267cfa25-31fb-4ef1-af56-1f468ac12dc6 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:42.728525714 +0000 UTC m=+64.235790808 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/267cfa25-31fb-4ef1-af56-1f468ac12dc6-metrics-tls") pod "dns-default-whfl9" (UID: "267cfa25-31fb-4ef1-af56-1f468ac12dc6") : secret "dns-default-metrics-tls" not found Apr 16 18:17:38.599607 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:38.599575 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tj7xv" Apr 16 18:17:42.740644 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:42.740602 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8503a04-7aaa-49ef-bec9-fb099ecb0065-cert\") pod \"ingress-canary-fqp4t\" (UID: \"b8503a04-7aaa-49ef-bec9-fb099ecb0065\") " pod="openshift-ingress-canary/ingress-canary-fqp4t" Apr 16 18:17:42.741037 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:42.740666 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/267cfa25-31fb-4ef1-af56-1f468ac12dc6-metrics-tls\") pod \"dns-default-whfl9\" (UID: \"267cfa25-31fb-4ef1-af56-1f468ac12dc6\") " pod="openshift-dns/dns-default-whfl9" Apr 16 18:17:42.741037 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:42.740762 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:17:42.741037 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:42.740832 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/267cfa25-31fb-4ef1-af56-1f468ac12dc6-metrics-tls podName:267cfa25-31fb-4ef1-af56-1f468ac12dc6 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:14.740816118 +0000 UTC m=+96.248081213 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/267cfa25-31fb-4ef1-af56-1f468ac12dc6-metrics-tls") pod "dns-default-whfl9" (UID: "267cfa25-31fb-4ef1-af56-1f468ac12dc6") : secret "dns-default-metrics-tls" not found Apr 16 18:17:42.741037 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:42.740761 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:17:42.741037 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:42.740877 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8503a04-7aaa-49ef-bec9-fb099ecb0065-cert podName:b8503a04-7aaa-49ef-bec9-fb099ecb0065 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:14.74086537 +0000 UTC m=+96.248130472 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b8503a04-7aaa-49ef-bec9-fb099ecb0065-cert") pod "ingress-canary-fqp4t" (UID: "b8503a04-7aaa-49ef-bec9-fb099ecb0065") : secret "canary-serving-cert" not found Apr 16 18:17:43.748162 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:43.748122 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7930c14-4ef0-4949-a2ae-9a240da66c3c-metrics-certs\") pod \"network-metrics-daemon-sw2bl\" (UID: \"a7930c14-4ef0-4949-a2ae-9a240da66c3c\") " pod="openshift-multus/network-metrics-daemon-sw2bl" Apr 16 18:17:43.748566 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:43.748272 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 18:17:43.748566 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:17:43.748350 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7930c14-4ef0-4949-a2ae-9a240da66c3c-metrics-certs podName:a7930c14-4ef0-4949-a2ae-9a240da66c3c nodeName:}" failed. 
No retries permitted until 2026-04-16 18:18:47.748334695 +0000 UTC m=+129.255599789 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7930c14-4ef0-4949-a2ae-9a240da66c3c-metrics-certs") pod "network-metrics-daemon-sw2bl" (UID: "a7930c14-4ef0-4949-a2ae-9a240da66c3c") : secret "metrics-daemon-secret" not found Apr 16 18:17:47.325900 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:17:47.325870 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-jgrxb" Apr 16 18:18:14.744437 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:14.744383 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8503a04-7aaa-49ef-bec9-fb099ecb0065-cert\") pod \"ingress-canary-fqp4t\" (UID: \"b8503a04-7aaa-49ef-bec9-fb099ecb0065\") " pod="openshift-ingress-canary/ingress-canary-fqp4t" Apr 16 18:18:14.744437 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:14.744452 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/267cfa25-31fb-4ef1-af56-1f468ac12dc6-metrics-tls\") pod \"dns-default-whfl9\" (UID: \"267cfa25-31fb-4ef1-af56-1f468ac12dc6\") " pod="openshift-dns/dns-default-whfl9" Apr 16 18:18:14.744939 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:14.744534 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:18:14.744939 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:14.744563 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:18:14.744939 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:14.744615 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/267cfa25-31fb-4ef1-af56-1f468ac12dc6-metrics-tls 
podName:267cfa25-31fb-4ef1-af56-1f468ac12dc6 nodeName:}" failed. No retries permitted until 2026-04-16 18:19:18.74460108 +0000 UTC m=+160.251866174 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/267cfa25-31fb-4ef1-af56-1f468ac12dc6-metrics-tls") pod "dns-default-whfl9" (UID: "267cfa25-31fb-4ef1-af56-1f468ac12dc6") : secret "dns-default-metrics-tls" not found Apr 16 18:18:14.744939 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:14.744640 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8503a04-7aaa-49ef-bec9-fb099ecb0065-cert podName:b8503a04-7aaa-49ef-bec9-fb099ecb0065 nodeName:}" failed. No retries permitted until 2026-04-16 18:19:18.744621504 +0000 UTC m=+160.251886620 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b8503a04-7aaa-49ef-bec9-fb099ecb0065-cert") pod "ingress-canary-fqp4t" (UID: "b8503a04-7aaa-49ef-bec9-fb099ecb0065") : secret "canary-serving-cert" not found Apr 16 18:18:43.535190 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.535156 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-4czjs"] Apr 16 18:18:43.537779 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.537763 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7tj4q"] Apr 16 18:18:43.537917 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.537900 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4czjs" Apr 16 18:18:43.540405 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.540382 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-8282h"] Apr 16 18:18:43.540506 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.540470 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7tj4q" Apr 16 18:18:43.540506 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.540382 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 18:18:43.542324 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.542306 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 18:18:43.542417 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.542329 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 16 18:18:43.542417 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.542329 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 16 18:18:43.543428 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.543409 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-jdrzv\"" Apr 16 18:18:43.543520 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.543410 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 16 18:18:43.543603 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.543563 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-8282h" Apr 16 18:18:43.543705 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.543686 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-5slg2\"" Apr 16 18:18:43.545908 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.545888 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 16 18:18:43.547270 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.546638 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:18:43.547270 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.546919 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-xcv67\"" Apr 16 18:18:43.552660 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.552640 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-4czjs"] Apr 16 18:18:43.561351 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.561333 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7tj4q"] Apr 16 18:18:43.566270 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.566247 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-8282h"] Apr 16 18:18:43.634912 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.634879 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/40926aa7-014a-4c73-95f1-c882be5b82a4-telemetry-config\") pod 
\"cluster-monitoring-operator-6667474d89-4czjs\" (UID: \"40926aa7-014a-4c73-95f1-c882be5b82a4\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4czjs"
Apr 16 18:18:43.634912 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.634914 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5fmw\" (UniqueName: \"kubernetes.io/projected/40926aa7-014a-4c73-95f1-c882be5b82a4-kube-api-access-z5fmw\") pod \"cluster-monitoring-operator-6667474d89-4czjs\" (UID: \"40926aa7-014a-4c73-95f1-c882be5b82a4\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4czjs"
Apr 16 18:18:43.635118 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.634937 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx69h\" (UniqueName: \"kubernetes.io/projected/ec84b517-4170-4c21-b909-567d1c8fe013-kube-api-access-xx69h\") pod \"network-check-source-7b678d77c7-8282h\" (UID: \"ec84b517-4170-4c21-b909-567d1c8fe013\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-8282h"
Apr 16 18:18:43.635118 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.635032 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/40926aa7-014a-4c73-95f1-c882be5b82a4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-4czjs\" (UID: \"40926aa7-014a-4c73-95f1-c882be5b82a4\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4czjs"
Apr 16 18:18:43.635118 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.635065 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fszt\" (UniqueName: \"kubernetes.io/projected/c326d3cf-5563-4387-a5e2-ca13828bea8b-kube-api-access-2fszt\") pod \"cluster-samples-operator-667775844f-7tj4q\" (UID: \"c326d3cf-5563-4387-a5e2-ca13828bea8b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7tj4q"
Apr 16 18:18:43.635118 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.635098 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c326d3cf-5563-4387-a5e2-ca13828bea8b-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-7tj4q\" (UID: \"c326d3cf-5563-4387-a5e2-ca13828bea8b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7tj4q"
Apr 16 18:18:43.636660 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.636638 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-5hlbs"]
Apr 16 18:18:43.639556 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.639531 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-dn4qx"]
Apr 16 18:18:43.639680 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.639664 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-5hlbs"
Apr 16 18:18:43.641985 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.641968 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-6564b967f4-t9vbs"]
Apr 16 18:18:43.642136 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.642117 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-dn4qx"
Apr 16 18:18:43.642719 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.642706 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 16 18:18:43.643097 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.643083 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 16 18:18:43.643311 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.643296 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 18:18:43.643387 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.643317 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-z5gk7\""
Apr 16 18:18:43.643624 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.643569 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 18:18:43.644767 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.644752 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-6564b967f4-t9vbs"
Apr 16 18:18:43.646732 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.646712 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 16 18:18:43.647342 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.647312 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 16 18:18:43.647770 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.647737 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 16 18:18:43.647872 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.647740 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:18:43.647872 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.647804 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-smc2n\""
Apr 16 18:18:43.647872 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.647854 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 16 18:18:43.648187 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.648171 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 16 18:18:43.648187 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.648177 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 16 18:18:43.648302 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.648200 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-xg58r\""
Apr 16 18:18:43.648611 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.648593 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 16 18:18:43.648746 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.648610 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 16 18:18:43.648746 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.648630 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 16 18:18:43.654374 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.654350 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-5hlbs"]
Apr 16 18:18:43.654499 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.654384 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 16 18:18:43.655381 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.655359 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 16 18:18:43.655636 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.655619 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-dn4qx"]
Apr 16 18:18:43.673217 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.673197 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-6564b967f4-t9vbs"]
Apr 16 18:18:43.736075 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.736041 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0-trusted-ca\") pod \"console-operator-d87b8d5fc-dn4qx\" (UID: \"a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0\") " pod="openshift-console-operator/console-operator-d87b8d5fc-dn4qx"
Apr 16 18:18:43.736237 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.736094 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/386777a9-63c1-4fa1-b894-4d73395765d3-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-5hlbs\" (UID: \"386777a9-63c1-4fa1-b894-4d73395765d3\") " pod="openshift-insights/insights-operator-5785d4fcdd-5hlbs"
Apr 16 18:18:43.736237 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.736115 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38c20e31-955b-4eb0-8e64-330c1b15b52e-service-ca-bundle\") pod \"router-default-6564b967f4-t9vbs\" (UID: \"38c20e31-955b-4eb0-8e64-330c1b15b52e\") " pod="openshift-ingress/router-default-6564b967f4-t9vbs"
Apr 16 18:18:43.736237 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.736173 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/40926aa7-014a-4c73-95f1-c882be5b82a4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-4czjs\" (UID: \"40926aa7-014a-4c73-95f1-c882be5b82a4\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4czjs"
Apr 16 18:18:43.736237 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.736202 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/386777a9-63c1-4fa1-b894-4d73395765d3-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-5hlbs\" (UID: \"386777a9-63c1-4fa1-b894-4d73395765d3\") " pod="openshift-insights/insights-operator-5785d4fcdd-5hlbs"
Apr 16 18:18:43.736237 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.736219 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38c20e31-955b-4eb0-8e64-330c1b15b52e-metrics-certs\") pod \"router-default-6564b967f4-t9vbs\" (UID: \"38c20e31-955b-4eb0-8e64-330c1b15b52e\") " pod="openshift-ingress/router-default-6564b967f4-t9vbs"
Apr 16 18:18:43.736396 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.736270 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fszt\" (UniqueName: \"kubernetes.io/projected/c326d3cf-5563-4387-a5e2-ca13828bea8b-kube-api-access-2fszt\") pod \"cluster-samples-operator-667775844f-7tj4q\" (UID: \"c326d3cf-5563-4387-a5e2-ca13828bea8b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7tj4q"
Apr 16 18:18:43.736396 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.736287 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77csj\" (UniqueName: \"kubernetes.io/projected/38c20e31-955b-4eb0-8e64-330c1b15b52e-kube-api-access-77csj\") pod \"router-default-6564b967f4-t9vbs\" (UID: \"38c20e31-955b-4eb0-8e64-330c1b15b52e\") " pod="openshift-ingress/router-default-6564b967f4-t9vbs"
Apr 16 18:18:43.736396 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:43.736308 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 18:18:43.736396 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.736319 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c326d3cf-5563-4387-a5e2-ca13828bea8b-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-7tj4q\" (UID: \"c326d3cf-5563-4387-a5e2-ca13828bea8b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7tj4q"
Apr 16 18:18:43.736396 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.736354 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/386777a9-63c1-4fa1-b894-4d73395765d3-serving-cert\") pod \"insights-operator-5785d4fcdd-5hlbs\" (UID: \"386777a9-63c1-4fa1-b894-4d73395765d3\") " pod="openshift-insights/insights-operator-5785d4fcdd-5hlbs"
Apr 16 18:18:43.736396 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:43.736374 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40926aa7-014a-4c73-95f1-c882be5b82a4-cluster-monitoring-operator-tls podName:40926aa7-014a-4c73-95f1-c882be5b82a4 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:44.236359608 +0000 UTC m=+125.743624702 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/40926aa7-014a-4c73-95f1-c882be5b82a4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-4czjs" (UID: "40926aa7-014a-4c73-95f1-c882be5b82a4") : secret "cluster-monitoring-operator-tls" not found
Apr 16 18:18:43.736639 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.736458 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/38c20e31-955b-4eb0-8e64-330c1b15b52e-default-certificate\") pod \"router-default-6564b967f4-t9vbs\" (UID: \"38c20e31-955b-4eb0-8e64-330c1b15b52e\") " pod="openshift-ingress/router-default-6564b967f4-t9vbs"
Apr 16 18:18:43.736639 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:43.736496 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 18:18:43.736639 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.736502 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5mpw\" (UniqueName: \"kubernetes.io/projected/386777a9-63c1-4fa1-b894-4d73395765d3-kube-api-access-v5mpw\") pod \"insights-operator-5785d4fcdd-5hlbs\" (UID: \"386777a9-63c1-4fa1-b894-4d73395765d3\") " pod="openshift-insights/insights-operator-5785d4fcdd-5hlbs"
Apr 16 18:18:43.736639 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:43.736575 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c326d3cf-5563-4387-a5e2-ca13828bea8b-samples-operator-tls podName:c326d3cf-5563-4387-a5e2-ca13828bea8b nodeName:}" failed. No retries permitted until 2026-04-16 18:18:44.236537779 +0000 UTC m=+125.743802887 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/c326d3cf-5563-4387-a5e2-ca13828bea8b-samples-operator-tls") pod "cluster-samples-operator-667775844f-7tj4q" (UID: "c326d3cf-5563-4387-a5e2-ca13828bea8b") : secret "samples-operator-tls" not found
Apr 16 18:18:43.736639 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.736589 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/40926aa7-014a-4c73-95f1-c882be5b82a4-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-4czjs\" (UID: \"40926aa7-014a-4c73-95f1-c882be5b82a4\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4czjs"
Apr 16 18:18:43.736639 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.736608 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/386777a9-63c1-4fa1-b894-4d73395765d3-tmp\") pod \"insights-operator-5785d4fcdd-5hlbs\" (UID: \"386777a9-63c1-4fa1-b894-4d73395765d3\") " pod="openshift-insights/insights-operator-5785d4fcdd-5hlbs"
Apr 16 18:18:43.736807 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.736641 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z5fmw\" (UniqueName: \"kubernetes.io/projected/40926aa7-014a-4c73-95f1-c882be5b82a4-kube-api-access-z5fmw\") pod \"cluster-monitoring-operator-6667474d89-4czjs\" (UID: \"40926aa7-014a-4c73-95f1-c882be5b82a4\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4czjs"
Apr 16 18:18:43.736807 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.736657 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxbrc\" (UniqueName: \"kubernetes.io/projected/a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0-kube-api-access-kxbrc\") pod \"console-operator-d87b8d5fc-dn4qx\" (UID: \"a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0\") " pod="openshift-console-operator/console-operator-d87b8d5fc-dn4qx"
Apr 16 18:18:43.736807 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.736674 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/386777a9-63c1-4fa1-b894-4d73395765d3-snapshots\") pod \"insights-operator-5785d4fcdd-5hlbs\" (UID: \"386777a9-63c1-4fa1-b894-4d73395765d3\") " pod="openshift-insights/insights-operator-5785d4fcdd-5hlbs"
Apr 16 18:18:43.736807 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.736690 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0-config\") pod \"console-operator-d87b8d5fc-dn4qx\" (UID: \"a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0\") " pod="openshift-console-operator/console-operator-d87b8d5fc-dn4qx"
Apr 16 18:18:43.736923 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.736802 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0-serving-cert\") pod \"console-operator-d87b8d5fc-dn4qx\" (UID: \"a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0\") " pod="openshift-console-operator/console-operator-d87b8d5fc-dn4qx"
Apr 16 18:18:43.736923 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.736824 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/38c20e31-955b-4eb0-8e64-330c1b15b52e-stats-auth\") pod \"router-default-6564b967f4-t9vbs\" (UID: \"38c20e31-955b-4eb0-8e64-330c1b15b52e\") " pod="openshift-ingress/router-default-6564b967f4-t9vbs"
Apr 16 18:18:43.736923 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.736849 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xx69h\" (UniqueName: \"kubernetes.io/projected/ec84b517-4170-4c21-b909-567d1c8fe013-kube-api-access-xx69h\") pod \"network-check-source-7b678d77c7-8282h\" (UID: \"ec84b517-4170-4c21-b909-567d1c8fe013\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-8282h"
Apr 16 18:18:43.737212 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.737195 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/40926aa7-014a-4c73-95f1-c882be5b82a4-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-4czjs\" (UID: \"40926aa7-014a-4c73-95f1-c882be5b82a4\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4czjs"
Apr 16 18:18:43.747003 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.746977 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx69h\" (UniqueName: \"kubernetes.io/projected/ec84b517-4170-4c21-b909-567d1c8fe013-kube-api-access-xx69h\") pod \"network-check-source-7b678d77c7-8282h\" (UID: \"ec84b517-4170-4c21-b909-567d1c8fe013\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-8282h"
Apr 16 18:18:43.747120 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.747077 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5fmw\" (UniqueName: \"kubernetes.io/projected/40926aa7-014a-4c73-95f1-c882be5b82a4-kube-api-access-z5fmw\") pod \"cluster-monitoring-operator-6667474d89-4czjs\" (UID: \"40926aa7-014a-4c73-95f1-c882be5b82a4\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4czjs"
Apr 16 18:18:43.747120 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.747084 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fszt\" (UniqueName: \"kubernetes.io/projected/c326d3cf-5563-4387-a5e2-ca13828bea8b-kube-api-access-2fszt\") pod \"cluster-samples-operator-667775844f-7tj4q\" (UID: \"c326d3cf-5563-4387-a5e2-ca13828bea8b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7tj4q"
Apr 16 18:18:43.837267 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.837154 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/386777a9-63c1-4fa1-b894-4d73395765d3-serving-cert\") pod \"insights-operator-5785d4fcdd-5hlbs\" (UID: \"386777a9-63c1-4fa1-b894-4d73395765d3\") " pod="openshift-insights/insights-operator-5785d4fcdd-5hlbs"
Apr 16 18:18:43.837267 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.837207 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/38c20e31-955b-4eb0-8e64-330c1b15b52e-default-certificate\") pod \"router-default-6564b967f4-t9vbs\" (UID: \"38c20e31-955b-4eb0-8e64-330c1b15b52e\") " pod="openshift-ingress/router-default-6564b967f4-t9vbs"
Apr 16 18:18:43.837267 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.837229 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v5mpw\" (UniqueName: \"kubernetes.io/projected/386777a9-63c1-4fa1-b894-4d73395765d3-kube-api-access-v5mpw\") pod \"insights-operator-5785d4fcdd-5hlbs\" (UID: \"386777a9-63c1-4fa1-b894-4d73395765d3\") " pod="openshift-insights/insights-operator-5785d4fcdd-5hlbs"
Apr 16 18:18:43.837267 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.837270 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/386777a9-63c1-4fa1-b894-4d73395765d3-tmp\") pod \"insights-operator-5785d4fcdd-5hlbs\" (UID: \"386777a9-63c1-4fa1-b894-4d73395765d3\") " pod="openshift-insights/insights-operator-5785d4fcdd-5hlbs"
Apr 16 18:18:43.837602 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.837306 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kxbrc\" (UniqueName: \"kubernetes.io/projected/a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0-kube-api-access-kxbrc\") pod \"console-operator-d87b8d5fc-dn4qx\" (UID: \"a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0\") " pod="openshift-console-operator/console-operator-d87b8d5fc-dn4qx"
Apr 16 18:18:43.837602 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.837331 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/386777a9-63c1-4fa1-b894-4d73395765d3-snapshots\") pod \"insights-operator-5785d4fcdd-5hlbs\" (UID: \"386777a9-63c1-4fa1-b894-4d73395765d3\") " pod="openshift-insights/insights-operator-5785d4fcdd-5hlbs"
Apr 16 18:18:43.837602 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.837352 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0-config\") pod \"console-operator-d87b8d5fc-dn4qx\" (UID: \"a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0\") " pod="openshift-console-operator/console-operator-d87b8d5fc-dn4qx"
Apr 16 18:18:43.837602 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.837377 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0-serving-cert\") pod \"console-operator-d87b8d5fc-dn4qx\" (UID: \"a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0\") " pod="openshift-console-operator/console-operator-d87b8d5fc-dn4qx"
Apr 16 18:18:43.837602 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.837398 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/38c20e31-955b-4eb0-8e64-330c1b15b52e-stats-auth\") pod \"router-default-6564b967f4-t9vbs\" (UID: \"38c20e31-955b-4eb0-8e64-330c1b15b52e\") " pod="openshift-ingress/router-default-6564b967f4-t9vbs"
Apr 16 18:18:43.837602 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.837589 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0-trusted-ca\") pod \"console-operator-d87b8d5fc-dn4qx\" (UID: \"a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0\") " pod="openshift-console-operator/console-operator-d87b8d5fc-dn4qx"
Apr 16 18:18:43.837786 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.837632 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/386777a9-63c1-4fa1-b894-4d73395765d3-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-5hlbs\" (UID: \"386777a9-63c1-4fa1-b894-4d73395765d3\") " pod="openshift-insights/insights-operator-5785d4fcdd-5hlbs"
Apr 16 18:18:43.837786 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.837650 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38c20e31-955b-4eb0-8e64-330c1b15b52e-service-ca-bundle\") pod \"router-default-6564b967f4-t9vbs\" (UID: \"38c20e31-955b-4eb0-8e64-330c1b15b52e\") " pod="openshift-ingress/router-default-6564b967f4-t9vbs"
Apr 16 18:18:43.837786 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.837682 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/386777a9-63c1-4fa1-b894-4d73395765d3-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-5hlbs\" (UID: \"386777a9-63c1-4fa1-b894-4d73395765d3\") " pod="openshift-insights/insights-operator-5785d4fcdd-5hlbs"
Apr 16 18:18:43.837786 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.837697 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38c20e31-955b-4eb0-8e64-330c1b15b52e-metrics-certs\") pod \"router-default-6564b967f4-t9vbs\" (UID: \"38c20e31-955b-4eb0-8e64-330c1b15b52e\") " pod="openshift-ingress/router-default-6564b967f4-t9vbs"
Apr 16 18:18:43.837786 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.837717 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-77csj\" (UniqueName: \"kubernetes.io/projected/38c20e31-955b-4eb0-8e64-330c1b15b52e-kube-api-access-77csj\") pod \"router-default-6564b967f4-t9vbs\" (UID: \"38c20e31-955b-4eb0-8e64-330c1b15b52e\") " pod="openshift-ingress/router-default-6564b967f4-t9vbs"
Apr 16 18:18:43.838287 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.838223 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/386777a9-63c1-4fa1-b894-4d73395765d3-snapshots\") pod \"insights-operator-5785d4fcdd-5hlbs\" (UID: \"386777a9-63c1-4fa1-b894-4d73395765d3\") " pod="openshift-insights/insights-operator-5785d4fcdd-5hlbs"
Apr 16 18:18:43.838454 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:43.838430 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 18:18:43.838537 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:43.838510 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38c20e31-955b-4eb0-8e64-330c1b15b52e-metrics-certs podName:38c20e31-955b-4eb0-8e64-330c1b15b52e nodeName:}" failed. No retries permitted until 2026-04-16 18:18:44.338489706 +0000 UTC m=+125.845754815 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38c20e31-955b-4eb0-8e64-330c1b15b52e-metrics-certs") pod "router-default-6564b967f4-t9vbs" (UID: "38c20e31-955b-4eb0-8e64-330c1b15b52e") : secret "router-metrics-certs-default" not found
Apr 16 18:18:43.838886 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.838862 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/386777a9-63c1-4fa1-b894-4d73395765d3-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-5hlbs\" (UID: \"386777a9-63c1-4fa1-b894-4d73395765d3\") " pod="openshift-insights/insights-operator-5785d4fcdd-5hlbs"
Apr 16 18:18:43.839050 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.838967 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0-config\") pod \"console-operator-d87b8d5fc-dn4qx\" (UID: \"a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0\") " pod="openshift-console-operator/console-operator-d87b8d5fc-dn4qx"
Apr 16 18:18:43.839050 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.839004 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/386777a9-63c1-4fa1-b894-4d73395765d3-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-5hlbs\" (UID: \"386777a9-63c1-4fa1-b894-4d73395765d3\") " pod="openshift-insights/insights-operator-5785d4fcdd-5hlbs"
Apr 16 18:18:43.839050 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.837985 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/386777a9-63c1-4fa1-b894-4d73395765d3-tmp\") pod \"insights-operator-5785d4fcdd-5hlbs\" (UID: \"386777a9-63c1-4fa1-b894-4d73395765d3\") " pod="openshift-insights/insights-operator-5785d4fcdd-5hlbs"
Apr 16 18:18:43.839253 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:43.839186 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/38c20e31-955b-4eb0-8e64-330c1b15b52e-service-ca-bundle podName:38c20e31-955b-4eb0-8e64-330c1b15b52e nodeName:}" failed. No retries permitted until 2026-04-16 18:18:44.339151918 +0000 UTC m=+125.846417016 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/38c20e31-955b-4eb0-8e64-330c1b15b52e-service-ca-bundle") pod "router-default-6564b967f4-t9vbs" (UID: "38c20e31-955b-4eb0-8e64-330c1b15b52e") : configmap references non-existent config key: service-ca.crt
Apr 16 18:18:43.840218 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.840188 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/38c20e31-955b-4eb0-8e64-330c1b15b52e-default-certificate\") pod \"router-default-6564b967f4-t9vbs\" (UID: \"38c20e31-955b-4eb0-8e64-330c1b15b52e\") " pod="openshift-ingress/router-default-6564b967f4-t9vbs"
Apr 16 18:18:43.840320 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.840301 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0-trusted-ca\") pod \"console-operator-d87b8d5fc-dn4qx\" (UID: \"a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0\") " pod="openshift-console-operator/console-operator-d87b8d5fc-dn4qx"
Apr 16 18:18:43.842809 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.840951 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/386777a9-63c1-4fa1-b894-4d73395765d3-serving-cert\") pod \"insights-operator-5785d4fcdd-5hlbs\" (UID: \"386777a9-63c1-4fa1-b894-4d73395765d3\") " pod="openshift-insights/insights-operator-5785d4fcdd-5hlbs"
Apr 16 18:18:43.842809 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.841535 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0-serving-cert\") pod \"console-operator-d87b8d5fc-dn4qx\" (UID: \"a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0\") " pod="openshift-console-operator/console-operator-d87b8d5fc-dn4qx"
Apr 16 18:18:43.844913 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.844886 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/38c20e31-955b-4eb0-8e64-330c1b15b52e-stats-auth\") pod \"router-default-6564b967f4-t9vbs\" (UID: \"38c20e31-955b-4eb0-8e64-330c1b15b52e\") " pod="openshift-ingress/router-default-6564b967f4-t9vbs"
Apr 16 18:18:43.846210 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.846180 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5mpw\" (UniqueName: \"kubernetes.io/projected/386777a9-63c1-4fa1-b894-4d73395765d3-kube-api-access-v5mpw\") pod \"insights-operator-5785d4fcdd-5hlbs\" (UID: \"386777a9-63c1-4fa1-b894-4d73395765d3\") " pod="openshift-insights/insights-operator-5785d4fcdd-5hlbs"
Apr 16 18:18:43.846517 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.846501 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxbrc\" (UniqueName: \"kubernetes.io/projected/a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0-kube-api-access-kxbrc\") pod \"console-operator-d87b8d5fc-dn4qx\" (UID: \"a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0\") " pod="openshift-console-operator/console-operator-d87b8d5fc-dn4qx"
Apr 16 18:18:43.846583 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.846536 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-77csj\" (UniqueName: \"kubernetes.io/projected/38c20e31-955b-4eb0-8e64-330c1b15b52e-kube-api-access-77csj\") pod \"router-default-6564b967f4-t9vbs\" (UID: \"38c20e31-955b-4eb0-8e64-330c1b15b52e\") " pod="openshift-ingress/router-default-6564b967f4-t9vbs"
Apr 16 18:18:43.861918 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.861885 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-8282h"
Apr 16 18:18:43.950151 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.950122 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-5hlbs"
Apr 16 18:18:43.958148 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.957269 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-dn4qx"
Apr 16 18:18:43.975800 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:43.975606 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-8282h"]
Apr 16 18:18:43.978841 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:18:43.978805 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec84b517_4170_4c21_b909_567d1c8fe013.slice/crio-58ef5210c197699d0bf2ada7723ce3b14fa5948f48db26d28f415bdc9b749221 WatchSource:0}: Error finding container 58ef5210c197699d0bf2ada7723ce3b14fa5948f48db26d28f415bdc9b749221: Status 404 returned error can't find the container with id 58ef5210c197699d0bf2ada7723ce3b14fa5948f48db26d28f415bdc9b749221
Apr 16 18:18:44.074956 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:44.074903 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-5hlbs"]
Apr 16 18:18:44.077388 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:18:44.077361 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod386777a9_63c1_4fa1_b894_4d73395765d3.slice/crio-04f194abee130cdf8fb19fb7a7c4a389b8ebaf021ec2d1f9d4e8b76053ff8b26 WatchSource:0}: Error finding container 04f194abee130cdf8fb19fb7a7c4a389b8ebaf021ec2d1f9d4e8b76053ff8b26: Status 404 returned error can't find the container with id 04f194abee130cdf8fb19fb7a7c4a389b8ebaf021ec2d1f9d4e8b76053ff8b26
Apr 16 18:18:44.093272 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:44.093191 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-dn4qx"]
Apr 16 18:18:44.096308 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:18:44.096280 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda22e2e3b_7179_49ec_8eda_9b8cf17c2ce0.slice/crio-aaa45734f6cb2f0b986de6d1d38c0257c3d7b79cd4494c5f27258226884a2b77 WatchSource:0}: Error finding container aaa45734f6cb2f0b986de6d1d38c0257c3d7b79cd4494c5f27258226884a2b77: Status 404 returned error can't find the container with id aaa45734f6cb2f0b986de6d1d38c0257c3d7b79cd4494c5f27258226884a2b77
Apr 16 18:18:44.242254 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:44.242221 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/40926aa7-014a-4c73-95f1-c882be5b82a4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-4czjs\" (UID: \"40926aa7-014a-4c73-95f1-c882be5b82a4\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4czjs"
Apr 16 18:18:44.242416 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:44.242272 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c326d3cf-5563-4387-a5e2-ca13828bea8b-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-7tj4q\" (UID: 
\"c326d3cf-5563-4387-a5e2-ca13828bea8b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7tj4q" Apr 16 18:18:44.242416 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:44.242359 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 18:18:44.242416 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:44.242362 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 18:18:44.242577 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:44.242422 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c326d3cf-5563-4387-a5e2-ca13828bea8b-samples-operator-tls podName:c326d3cf-5563-4387-a5e2-ca13828bea8b nodeName:}" failed. No retries permitted until 2026-04-16 18:18:45.242408424 +0000 UTC m=+126.749673519 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/c326d3cf-5563-4387-a5e2-ca13828bea8b-samples-operator-tls") pod "cluster-samples-operator-667775844f-7tj4q" (UID: "c326d3cf-5563-4387-a5e2-ca13828bea8b") : secret "samples-operator-tls" not found Apr 16 18:18:44.242577 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:44.242436 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40926aa7-014a-4c73-95f1-c882be5b82a4-cluster-monitoring-operator-tls podName:40926aa7-014a-4c73-95f1-c882be5b82a4 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:45.242429813 +0000 UTC m=+126.749694908 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/40926aa7-014a-4c73-95f1-c882be5b82a4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-4czjs" (UID: "40926aa7-014a-4c73-95f1-c882be5b82a4") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:18:44.342993 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:44.342952 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38c20e31-955b-4eb0-8e64-330c1b15b52e-service-ca-bundle\") pod \"router-default-6564b967f4-t9vbs\" (UID: \"38c20e31-955b-4eb0-8e64-330c1b15b52e\") " pod="openshift-ingress/router-default-6564b967f4-t9vbs" Apr 16 18:18:44.343169 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:44.343006 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38c20e31-955b-4eb0-8e64-330c1b15b52e-metrics-certs\") pod \"router-default-6564b967f4-t9vbs\" (UID: \"38c20e31-955b-4eb0-8e64-330c1b15b52e\") " pod="openshift-ingress/router-default-6564b967f4-t9vbs" Apr 16 18:18:44.343169 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:44.343120 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:18:44.343169 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:44.343132 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/38c20e31-955b-4eb0-8e64-330c1b15b52e-service-ca-bundle podName:38c20e31-955b-4eb0-8e64-330c1b15b52e nodeName:}" failed. No retries permitted until 2026-04-16 18:18:45.34311479 +0000 UTC m=+126.850379905 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/38c20e31-955b-4eb0-8e64-330c1b15b52e-service-ca-bundle") pod "router-default-6564b967f4-t9vbs" (UID: "38c20e31-955b-4eb0-8e64-330c1b15b52e") : configmap references non-existent config key: service-ca.crt Apr 16 18:18:44.343169 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:44.343157 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38c20e31-955b-4eb0-8e64-330c1b15b52e-metrics-certs podName:38c20e31-955b-4eb0-8e64-330c1b15b52e nodeName:}" failed. No retries permitted until 2026-04-16 18:18:45.343147029 +0000 UTC m=+126.850412124 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38c20e31-955b-4eb0-8e64-330c1b15b52e-metrics-certs") pod "router-default-6564b967f4-t9vbs" (UID: "38c20e31-955b-4eb0-8e64-330c1b15b52e") : secret "router-metrics-certs-default" not found Apr 16 18:18:44.485126 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:44.485088 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-dn4qx" event={"ID":"a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0","Type":"ContainerStarted","Data":"aaa45734f6cb2f0b986de6d1d38c0257c3d7b79cd4494c5f27258226884a2b77"} Apr 16 18:18:44.486366 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:44.486345 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-8282h" event={"ID":"ec84b517-4170-4c21-b909-567d1c8fe013","Type":"ContainerStarted","Data":"732f29d9cc5932476877e4d23b2c6c62cd0de5202d77e5647d5625bc2ec31ad6"} Apr 16 18:18:44.486366 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:44.486371 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-8282h" 
event={"ID":"ec84b517-4170-4c21-b909-567d1c8fe013","Type":"ContainerStarted","Data":"58ef5210c197699d0bf2ada7723ce3b14fa5948f48db26d28f415bdc9b749221"} Apr 16 18:18:44.487385 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:44.487369 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-5hlbs" event={"ID":"386777a9-63c1-4fa1-b894-4d73395765d3","Type":"ContainerStarted","Data":"04f194abee130cdf8fb19fb7a7c4a389b8ebaf021ec2d1f9d4e8b76053ff8b26"} Apr 16 18:18:44.501617 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:44.501575 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-8282h" podStartSLOduration=1.5015624829999998 podStartE2EDuration="1.501562483s" podCreationTimestamp="2026-04-16 18:18:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:18:44.50142588 +0000 UTC m=+126.008690996" watchObservedRunningTime="2026-04-16 18:18:44.501562483 +0000 UTC m=+126.008827598" Apr 16 18:18:45.250415 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:45.250370 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/40926aa7-014a-4c73-95f1-c882be5b82a4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-4czjs\" (UID: \"40926aa7-014a-4c73-95f1-c882be5b82a4\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4czjs" Apr 16 18:18:45.250882 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:45.250461 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c326d3cf-5563-4387-a5e2-ca13828bea8b-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-7tj4q\" (UID: \"c326d3cf-5563-4387-a5e2-ca13828bea8b\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7tj4q" Apr 16 18:18:45.250882 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:45.250561 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 18:18:45.250882 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:45.250589 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 18:18:45.250882 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:45.250648 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40926aa7-014a-4c73-95f1-c882be5b82a4-cluster-monitoring-operator-tls podName:40926aa7-014a-4c73-95f1-c882be5b82a4 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:47.250624813 +0000 UTC m=+128.757889911 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/40926aa7-014a-4c73-95f1-c882be5b82a4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-4czjs" (UID: "40926aa7-014a-4c73-95f1-c882be5b82a4") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:18:45.250882 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:45.250671 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c326d3cf-5563-4387-a5e2-ca13828bea8b-samples-operator-tls podName:c326d3cf-5563-4387-a5e2-ca13828bea8b nodeName:}" failed. No retries permitted until 2026-04-16 18:18:47.250660767 +0000 UTC m=+128.757925870 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/c326d3cf-5563-4387-a5e2-ca13828bea8b-samples-operator-tls") pod "cluster-samples-operator-667775844f-7tj4q" (UID: "c326d3cf-5563-4387-a5e2-ca13828bea8b") : secret "samples-operator-tls" not found Apr 16 18:18:45.351693 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:45.351652 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38c20e31-955b-4eb0-8e64-330c1b15b52e-service-ca-bundle\") pod \"router-default-6564b967f4-t9vbs\" (UID: \"38c20e31-955b-4eb0-8e64-330c1b15b52e\") " pod="openshift-ingress/router-default-6564b967f4-t9vbs" Apr 16 18:18:45.351890 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:45.351713 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38c20e31-955b-4eb0-8e64-330c1b15b52e-metrics-certs\") pod \"router-default-6564b967f4-t9vbs\" (UID: \"38c20e31-955b-4eb0-8e64-330c1b15b52e\") " pod="openshift-ingress/router-default-6564b967f4-t9vbs" Apr 16 18:18:45.351890 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:45.351828 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/38c20e31-955b-4eb0-8e64-330c1b15b52e-service-ca-bundle podName:38c20e31-955b-4eb0-8e64-330c1b15b52e nodeName:}" failed. No retries permitted until 2026-04-16 18:18:47.351807286 +0000 UTC m=+128.859072385 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/38c20e31-955b-4eb0-8e64-330c1b15b52e-service-ca-bundle") pod "router-default-6564b967f4-t9vbs" (UID: "38c20e31-955b-4eb0-8e64-330c1b15b52e") : configmap references non-existent config key: service-ca.crt Apr 16 18:18:45.352018 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:45.351886 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:18:45.352018 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:45.351952 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38c20e31-955b-4eb0-8e64-330c1b15b52e-metrics-certs podName:38c20e31-955b-4eb0-8e64-330c1b15b52e nodeName:}" failed. No retries permitted until 2026-04-16 18:18:47.351932971 +0000 UTC m=+128.859198083 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38c20e31-955b-4eb0-8e64-330c1b15b52e-metrics-certs") pod "router-default-6564b967f4-t9vbs" (UID: "38c20e31-955b-4eb0-8e64-330c1b15b52e") : secret "router-metrics-certs-default" not found Apr 16 18:18:47.268523 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:47.268481 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/40926aa7-014a-4c73-95f1-c882be5b82a4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-4czjs\" (UID: \"40926aa7-014a-4c73-95f1-c882be5b82a4\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4czjs" Apr 16 18:18:47.268943 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:47.268561 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c326d3cf-5563-4387-a5e2-ca13828bea8b-samples-operator-tls\") pod 
\"cluster-samples-operator-667775844f-7tj4q\" (UID: \"c326d3cf-5563-4387-a5e2-ca13828bea8b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7tj4q" Apr 16 18:18:47.268943 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:47.268631 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 18:18:47.268943 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:47.268657 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 18:18:47.268943 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:47.268706 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c326d3cf-5563-4387-a5e2-ca13828bea8b-samples-operator-tls podName:c326d3cf-5563-4387-a5e2-ca13828bea8b nodeName:}" failed. No retries permitted until 2026-04-16 18:18:51.268691651 +0000 UTC m=+132.775956745 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/c326d3cf-5563-4387-a5e2-ca13828bea8b-samples-operator-tls") pod "cluster-samples-operator-667775844f-7tj4q" (UID: "c326d3cf-5563-4387-a5e2-ca13828bea8b") : secret "samples-operator-tls" not found Apr 16 18:18:47.268943 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:47.268721 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40926aa7-014a-4c73-95f1-c882be5b82a4-cluster-monitoring-operator-tls podName:40926aa7-014a-4c73-95f1-c882be5b82a4 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:51.268713103 +0000 UTC m=+132.775978197 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/40926aa7-014a-4c73-95f1-c882be5b82a4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-4czjs" (UID: "40926aa7-014a-4c73-95f1-c882be5b82a4") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:18:47.368892 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:47.368855 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38c20e31-955b-4eb0-8e64-330c1b15b52e-service-ca-bundle\") pod \"router-default-6564b967f4-t9vbs\" (UID: \"38c20e31-955b-4eb0-8e64-330c1b15b52e\") " pod="openshift-ingress/router-default-6564b967f4-t9vbs" Apr 16 18:18:47.369058 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:47.368916 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38c20e31-955b-4eb0-8e64-330c1b15b52e-metrics-certs\") pod \"router-default-6564b967f4-t9vbs\" (UID: \"38c20e31-955b-4eb0-8e64-330c1b15b52e\") " pod="openshift-ingress/router-default-6564b967f4-t9vbs" Apr 16 18:18:47.369058 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:47.369044 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:18:47.369148 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:47.369050 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/38c20e31-955b-4eb0-8e64-330c1b15b52e-service-ca-bundle podName:38c20e31-955b-4eb0-8e64-330c1b15b52e nodeName:}" failed. No retries permitted until 2026-04-16 18:18:51.369031075 +0000 UTC m=+132.876296192 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/38c20e31-955b-4eb0-8e64-330c1b15b52e-service-ca-bundle") pod "router-default-6564b967f4-t9vbs" (UID: "38c20e31-955b-4eb0-8e64-330c1b15b52e") : configmap references non-existent config key: service-ca.crt Apr 16 18:18:47.369148 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:47.369095 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38c20e31-955b-4eb0-8e64-330c1b15b52e-metrics-certs podName:38c20e31-955b-4eb0-8e64-330c1b15b52e nodeName:}" failed. No retries permitted until 2026-04-16 18:18:51.369085066 +0000 UTC m=+132.876350165 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38c20e31-955b-4eb0-8e64-330c1b15b52e-metrics-certs") pod "router-default-6564b967f4-t9vbs" (UID: "38c20e31-955b-4eb0-8e64-330c1b15b52e") : secret "router-metrics-certs-default" not found Apr 16 18:18:47.494363 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:47.494340 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-dn4qx_a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0/console-operator/0.log" Apr 16 18:18:47.494513 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:47.494380 2573 generic.go:358] "Generic (PLEG): container finished" podID="a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0" containerID="d756d0252775856cd58ecb35937c83755900465610849d2dbddd9b0781b9069b" exitCode=255 Apr 16 18:18:47.494513 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:47.494447 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-dn4qx" event={"ID":"a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0","Type":"ContainerDied","Data":"d756d0252775856cd58ecb35937c83755900465610849d2dbddd9b0781b9069b"} Apr 16 18:18:47.494726 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:47.494698 2573 scope.go:117] "RemoveContainer" 
containerID="d756d0252775856cd58ecb35937c83755900465610849d2dbddd9b0781b9069b" Apr 16 18:18:47.495891 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:47.495869 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-5hlbs" event={"ID":"386777a9-63c1-4fa1-b894-4d73395765d3","Type":"ContainerStarted","Data":"21a96ecac47f482cb1ce6a75a1cfbfa19b243081ca30ebcda605ebf237df3daa"} Apr 16 18:18:47.529809 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:47.529667 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-5785d4fcdd-5hlbs" podStartSLOduration=2.0868868 podStartE2EDuration="4.529652992s" podCreationTimestamp="2026-04-16 18:18:43 +0000 UTC" firstStartedPulling="2026-04-16 18:18:44.079227593 +0000 UTC m=+125.586492688" lastFinishedPulling="2026-04-16 18:18:46.521993785 +0000 UTC m=+128.029258880" observedRunningTime="2026-04-16 18:18:47.529194001 +0000 UTC m=+129.036459130" watchObservedRunningTime="2026-04-16 18:18:47.529652992 +0000 UTC m=+129.036918106" Apr 16 18:18:47.774031 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:47.773929 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7930c14-4ef0-4949-a2ae-9a240da66c3c-metrics-certs\") pod \"network-metrics-daemon-sw2bl\" (UID: \"a7930c14-4ef0-4949-a2ae-9a240da66c3c\") " pod="openshift-multus/network-metrics-daemon-sw2bl" Apr 16 18:18:47.774185 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:47.774067 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 18:18:47.774185 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:47.774136 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7930c14-4ef0-4949-a2ae-9a240da66c3c-metrics-certs podName:a7930c14-4ef0-4949-a2ae-9a240da66c3c nodeName:}" failed. 
No retries permitted until 2026-04-16 18:20:49.774116395 +0000 UTC m=+251.281381492 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7930c14-4ef0-4949-a2ae-9a240da66c3c-metrics-certs") pod "network-metrics-daemon-sw2bl" (UID: "a7930c14-4ef0-4949-a2ae-9a240da66c3c") : secret "metrics-daemon-secret" not found Apr 16 18:18:48.499110 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:48.499081 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-dn4qx_a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0/console-operator/1.log" Apr 16 18:18:48.499486 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:48.499459 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-dn4qx_a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0/console-operator/0.log" Apr 16 18:18:48.499526 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:48.499488 2573 generic.go:358] "Generic (PLEG): container finished" podID="a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0" containerID="a22991f980087919dda993a80811e57aad3a0b405d300ce0be3e7fd5947c9eb6" exitCode=255 Apr 16 18:18:48.499607 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:48.499582 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-dn4qx" event={"ID":"a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0","Type":"ContainerDied","Data":"a22991f980087919dda993a80811e57aad3a0b405d300ce0be3e7fd5947c9eb6"} Apr 16 18:18:48.499649 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:48.499631 2573 scope.go:117] "RemoveContainer" containerID="d756d0252775856cd58ecb35937c83755900465610849d2dbddd9b0781b9069b" Apr 16 18:18:48.499923 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:48.499906 2573 scope.go:117] "RemoveContainer" containerID="a22991f980087919dda993a80811e57aad3a0b405d300ce0be3e7fd5947c9eb6" Apr 16 18:18:48.500108 ip-10-0-131-203 
kubenswrapper[2573]: E0416 18:18:48.500090 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-dn4qx_openshift-console-operator(a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-dn4qx" podUID="a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0" Apr 16 18:18:49.391869 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:49.391844 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-ll55v_18fc33d6-c4dd-487b-8457-811880ffd3ea/dns-node-resolver/0.log" Apr 16 18:18:49.502984 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:49.502961 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-dn4qx_a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0/console-operator/1.log" Apr 16 18:18:49.503339 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:49.503263 2573 scope.go:117] "RemoveContainer" containerID="a22991f980087919dda993a80811e57aad3a0b405d300ce0be3e7fd5947c9eb6" Apr 16 18:18:49.503442 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:49.503425 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-dn4qx_openshift-console-operator(a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-dn4qx" podUID="a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0" Apr 16 18:18:50.592096 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:50.592066 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-d89lv_5146021c-a86d-4b5f-a47d-7f8c736f756e/node-ca/0.log" Apr 16 18:18:50.962471 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:50.962440 2573 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-rgd5m"] Apr 16 18:18:50.965117 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:50.965102 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-rgd5m" Apr 16 18:18:50.968080 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:50.968059 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 16 18:18:50.969122 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:50.969107 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-26bsc\"" Apr 16 18:18:50.969213 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:50.969123 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 16 18:18:50.969213 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:50.969159 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 16 18:18:50.969414 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:50.969401 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 16 18:18:50.972761 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:50.972740 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-rgd5m"] Apr 16 18:18:51.099080 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:51.099047 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/89d43139-8357-4dc9-915c-1934f278937d-signing-key\") pod \"service-ca-bfc587fb7-rgd5m\" (UID: \"89d43139-8357-4dc9-915c-1934f278937d\") " pod="openshift-service-ca/service-ca-bfc587fb7-rgd5m" Apr 16 
18:18:51.099231 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:51.099110 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/89d43139-8357-4dc9-915c-1934f278937d-signing-cabundle\") pod \"service-ca-bfc587fb7-rgd5m\" (UID: \"89d43139-8357-4dc9-915c-1934f278937d\") " pod="openshift-service-ca/service-ca-bfc587fb7-rgd5m"
Apr 16 18:18:51.099231 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:51.099146 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5rpg\" (UniqueName: \"kubernetes.io/projected/89d43139-8357-4dc9-915c-1934f278937d-kube-api-access-h5rpg\") pod \"service-ca-bfc587fb7-rgd5m\" (UID: \"89d43139-8357-4dc9-915c-1934f278937d\") " pod="openshift-service-ca/service-ca-bfc587fb7-rgd5m"
Apr 16 18:18:51.200027 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:51.199993 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/89d43139-8357-4dc9-915c-1934f278937d-signing-key\") pod \"service-ca-bfc587fb7-rgd5m\" (UID: \"89d43139-8357-4dc9-915c-1934f278937d\") " pod="openshift-service-ca/service-ca-bfc587fb7-rgd5m"
Apr 16 18:18:51.200190 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:51.200078 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/89d43139-8357-4dc9-915c-1934f278937d-signing-cabundle\") pod \"service-ca-bfc587fb7-rgd5m\" (UID: \"89d43139-8357-4dc9-915c-1934f278937d\") " pod="openshift-service-ca/service-ca-bfc587fb7-rgd5m"
Apr 16 18:18:51.200252 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:51.200205 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h5rpg\" (UniqueName: \"kubernetes.io/projected/89d43139-8357-4dc9-915c-1934f278937d-kube-api-access-h5rpg\") pod \"service-ca-bfc587fb7-rgd5m\" (UID: \"89d43139-8357-4dc9-915c-1934f278937d\") " pod="openshift-service-ca/service-ca-bfc587fb7-rgd5m"
Apr 16 18:18:51.200692 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:51.200671 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/89d43139-8357-4dc9-915c-1934f278937d-signing-cabundle\") pod \"service-ca-bfc587fb7-rgd5m\" (UID: \"89d43139-8357-4dc9-915c-1934f278937d\") " pod="openshift-service-ca/service-ca-bfc587fb7-rgd5m"
Apr 16 18:18:51.202246 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:51.202219 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/89d43139-8357-4dc9-915c-1934f278937d-signing-key\") pod \"service-ca-bfc587fb7-rgd5m\" (UID: \"89d43139-8357-4dc9-915c-1934f278937d\") " pod="openshift-service-ca/service-ca-bfc587fb7-rgd5m"
Apr 16 18:18:51.207755 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:51.207736 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5rpg\" (UniqueName: \"kubernetes.io/projected/89d43139-8357-4dc9-915c-1934f278937d-kube-api-access-h5rpg\") pod \"service-ca-bfc587fb7-rgd5m\" (UID: \"89d43139-8357-4dc9-915c-1934f278937d\") " pod="openshift-service-ca/service-ca-bfc587fb7-rgd5m"
Apr 16 18:18:51.273279 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:51.273188 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-rgd5m"
Apr 16 18:18:51.301131 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:51.301099 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/40926aa7-014a-4c73-95f1-c882be5b82a4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-4czjs\" (UID: \"40926aa7-014a-4c73-95f1-c882be5b82a4\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4czjs"
Apr 16 18:18:51.301267 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:51.301142 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c326d3cf-5563-4387-a5e2-ca13828bea8b-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-7tj4q\" (UID: \"c326d3cf-5563-4387-a5e2-ca13828bea8b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7tj4q"
Apr 16 18:18:51.301306 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:51.301270 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 18:18:51.301306 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:51.301282 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 18:18:51.301368 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:51.301351 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c326d3cf-5563-4387-a5e2-ca13828bea8b-samples-operator-tls podName:c326d3cf-5563-4387-a5e2-ca13828bea8b nodeName:}" failed. No retries permitted until 2026-04-16 18:18:59.301334536 +0000 UTC m=+140.808599647 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/c326d3cf-5563-4387-a5e2-ca13828bea8b-samples-operator-tls") pod "cluster-samples-operator-667775844f-7tj4q" (UID: "c326d3cf-5563-4387-a5e2-ca13828bea8b") : secret "samples-operator-tls" not found
Apr 16 18:18:51.301368 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:51.301366 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40926aa7-014a-4c73-95f1-c882be5b82a4-cluster-monitoring-operator-tls podName:40926aa7-014a-4c73-95f1-c882be5b82a4 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:59.301359349 +0000 UTC m=+140.808624443 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/40926aa7-014a-4c73-95f1-c882be5b82a4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-4czjs" (UID: "40926aa7-014a-4c73-95f1-c882be5b82a4") : secret "cluster-monitoring-operator-tls" not found
Apr 16 18:18:51.385004 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:51.384805 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-rgd5m"]
Apr 16 18:18:51.389789 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:18:51.389759 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89d43139_8357_4dc9_915c_1934f278937d.slice/crio-5034108684e0ce2e9e5ed98a2dd22d86227abed4053c18556899679992616a38 WatchSource:0}: Error finding container 5034108684e0ce2e9e5ed98a2dd22d86227abed4053c18556899679992616a38: Status 404 returned error can't find the container with id 5034108684e0ce2e9e5ed98a2dd22d86227abed4053c18556899679992616a38
Apr 16 18:18:51.401722 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:51.401702 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38c20e31-955b-4eb0-8e64-330c1b15b52e-metrics-certs\") pod \"router-default-6564b967f4-t9vbs\" (UID: \"38c20e31-955b-4eb0-8e64-330c1b15b52e\") " pod="openshift-ingress/router-default-6564b967f4-t9vbs"
Apr 16 18:18:51.401831 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:51.401751 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38c20e31-955b-4eb0-8e64-330c1b15b52e-service-ca-bundle\") pod \"router-default-6564b967f4-t9vbs\" (UID: \"38c20e31-955b-4eb0-8e64-330c1b15b52e\") " pod="openshift-ingress/router-default-6564b967f4-t9vbs"
Apr 16 18:18:51.401895 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:51.401849 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/38c20e31-955b-4eb0-8e64-330c1b15b52e-service-ca-bundle podName:38c20e31-955b-4eb0-8e64-330c1b15b52e nodeName:}" failed. No retries permitted until 2026-04-16 18:18:59.401837109 +0000 UTC m=+140.909102203 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/38c20e31-955b-4eb0-8e64-330c1b15b52e-service-ca-bundle") pod "router-default-6564b967f4-t9vbs" (UID: "38c20e31-955b-4eb0-8e64-330c1b15b52e") : configmap references non-existent config key: service-ca.crt
Apr 16 18:18:51.401895 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:51.401856 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 18:18:51.402003 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:51.401943 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38c20e31-955b-4eb0-8e64-330c1b15b52e-metrics-certs podName:38c20e31-955b-4eb0-8e64-330c1b15b52e nodeName:}" failed. No retries permitted until 2026-04-16 18:18:59.401925849 +0000 UTC m=+140.909190988 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38c20e31-955b-4eb0-8e64-330c1b15b52e-metrics-certs") pod "router-default-6564b967f4-t9vbs" (UID: "38c20e31-955b-4eb0-8e64-330c1b15b52e") : secret "router-metrics-certs-default" not found
Apr 16 18:18:51.507971 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:51.507938 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-rgd5m" event={"ID":"89d43139-8357-4dc9-915c-1934f278937d","Type":"ContainerStarted","Data":"5034108684e0ce2e9e5ed98a2dd22d86227abed4053c18556899679992616a38"}
Apr 16 18:18:53.513982 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:53.513892 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-rgd5m" event={"ID":"89d43139-8357-4dc9-915c-1934f278937d","Type":"ContainerStarted","Data":"23d6e8afc3f12b59db79f324c664792260420a675e231201eb7a37e07fca845f"}
Apr 16 18:18:53.529857 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:53.529810 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-bfc587fb7-rgd5m" podStartSLOduration=1.657186028 podStartE2EDuration="3.529795383s" podCreationTimestamp="2026-04-16 18:18:50 +0000 UTC" firstStartedPulling="2026-04-16 18:18:51.391467679 +0000 UTC m=+132.898732774" lastFinishedPulling="2026-04-16 18:18:53.264077031 +0000 UTC m=+134.771342129" observedRunningTime="2026-04-16 18:18:53.528919263 +0000 UTC m=+135.036184379" watchObservedRunningTime="2026-04-16 18:18:53.529795383 +0000 UTC m=+135.037060499"
Apr 16 18:18:53.958651 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:53.958616 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-dn4qx"
Apr 16 18:18:53.958651 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:53.958649 2573 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-dn4qx"
Apr 16 18:18:53.959014 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:53.959002 2573 scope.go:117] "RemoveContainer" containerID="a22991f980087919dda993a80811e57aad3a0b405d300ce0be3e7fd5947c9eb6"
Apr 16 18:18:53.959184 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:53.959167 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-dn4qx_openshift-console-operator(a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-dn4qx" podUID="a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0"
Apr 16 18:18:59.370990 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:59.370949 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/40926aa7-014a-4c73-95f1-c882be5b82a4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-4czjs\" (UID: \"40926aa7-014a-4c73-95f1-c882be5b82a4\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4czjs"
Apr 16 18:18:59.371420 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:59.371007 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c326d3cf-5563-4387-a5e2-ca13828bea8b-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-7tj4q\" (UID: \"c326d3cf-5563-4387-a5e2-ca13828bea8b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7tj4q"
Apr 16 18:18:59.371420 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:59.371091 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 18:18:59.371420 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:59.371173 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40926aa7-014a-4c73-95f1-c882be5b82a4-cluster-monitoring-operator-tls podName:40926aa7-014a-4c73-95f1-c882be5b82a4 nodeName:}" failed. No retries permitted until 2026-04-16 18:19:15.371151729 +0000 UTC m=+156.878416823 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/40926aa7-014a-4c73-95f1-c882be5b82a4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-4czjs" (UID: "40926aa7-014a-4c73-95f1-c882be5b82a4") : secret "cluster-monitoring-operator-tls" not found
Apr 16 18:18:59.373801 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:59.373774 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c326d3cf-5563-4387-a5e2-ca13828bea8b-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-7tj4q\" (UID: \"c326d3cf-5563-4387-a5e2-ca13828bea8b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7tj4q"
Apr 16 18:18:59.456332 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:59.456303 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7tj4q"
Apr 16 18:18:59.472246 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:59.472214 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38c20e31-955b-4eb0-8e64-330c1b15b52e-service-ca-bundle\") pod \"router-default-6564b967f4-t9vbs\" (UID: \"38c20e31-955b-4eb0-8e64-330c1b15b52e\") " pod="openshift-ingress/router-default-6564b967f4-t9vbs"
Apr 16 18:18:59.472358 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:59.472343 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/38c20e31-955b-4eb0-8e64-330c1b15b52e-service-ca-bundle podName:38c20e31-955b-4eb0-8e64-330c1b15b52e nodeName:}" failed. No retries permitted until 2026-04-16 18:19:15.472322704 +0000 UTC m=+156.979587800 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/38c20e31-955b-4eb0-8e64-330c1b15b52e-service-ca-bundle") pod "router-default-6564b967f4-t9vbs" (UID: "38c20e31-955b-4eb0-8e64-330c1b15b52e") : configmap references non-existent config key: service-ca.crt
Apr 16 18:18:59.472405 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:59.472371 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38c20e31-955b-4eb0-8e64-330c1b15b52e-metrics-certs\") pod \"router-default-6564b967f4-t9vbs\" (UID: \"38c20e31-955b-4eb0-8e64-330c1b15b52e\") " pod="openshift-ingress/router-default-6564b967f4-t9vbs"
Apr 16 18:18:59.472492 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:59.472480 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 18:18:59.472530 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:18:59.472523 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38c20e31-955b-4eb0-8e64-330c1b15b52e-metrics-certs podName:38c20e31-955b-4eb0-8e64-330c1b15b52e nodeName:}" failed. No retries permitted until 2026-04-16 18:19:15.472511928 +0000 UTC m=+156.979777029 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38c20e31-955b-4eb0-8e64-330c1b15b52e-metrics-certs") pod "router-default-6564b967f4-t9vbs" (UID: "38c20e31-955b-4eb0-8e64-330c1b15b52e") : secret "router-metrics-certs-default" not found
Apr 16 18:18:59.573707 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:18:59.573678 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7tj4q"]
Apr 16 18:19:00.530434 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:00.530391 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7tj4q" event={"ID":"c326d3cf-5563-4387-a5e2-ca13828bea8b","Type":"ContainerStarted","Data":"bd5f8c0a8c1c9234b5332dadedf720654e3caa172a3a01cc4605f59468319baa"}
Apr 16 18:19:02.536521 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:02.536490 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7tj4q" event={"ID":"c326d3cf-5563-4387-a5e2-ca13828bea8b","Type":"ContainerStarted","Data":"425f6883b35a3f576f21209ad6c4ed509695f8187e1cb6a7e8e180fac7d92574"}
Apr 16 18:19:02.536932 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:02.536527 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7tj4q" event={"ID":"c326d3cf-5563-4387-a5e2-ca13828bea8b","Type":"ContainerStarted","Data":"a80e4da00436aeee1e6d0c90ae61214d28d0b300f415ff72dc88e002b3102595"}
Apr 16 18:19:02.561332 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:02.561279 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7tj4q" podStartSLOduration=17.662136067 podStartE2EDuration="19.561265656s" podCreationTimestamp="2026-04-16 18:18:43 +0000 UTC" firstStartedPulling="2026-04-16 18:18:59.618883323 +0000 UTC m=+141.126148418" lastFinishedPulling="2026-04-16 18:19:01.518012913 +0000 UTC m=+143.025278007" observedRunningTime="2026-04-16 18:19:02.556317399 +0000 UTC m=+144.063582515" watchObservedRunningTime="2026-04-16 18:19:02.561265656 +0000 UTC m=+144.068530772"
Apr 16 18:19:09.118173 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:09.118146 2573 scope.go:117] "RemoveContainer" containerID="a22991f980087919dda993a80811e57aad3a0b405d300ce0be3e7fd5947c9eb6"
Apr 16 18:19:09.556981 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:09.556955 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-dn4qx_a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0/console-operator/2.log"
Apr 16 18:19:09.557316 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:09.557300 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-dn4qx_a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0/console-operator/1.log"
Apr 16 18:19:09.557459 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:09.557334 2573 generic.go:358] "Generic (PLEG): container finished" podID="a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0" containerID="660049fc0f4f5940b3195b39b9c7309a7a912fd03509a07929a7bb3b272315a8" exitCode=255
Apr 16 18:19:09.557459 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:09.557378 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-dn4qx" event={"ID":"a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0","Type":"ContainerDied","Data":"660049fc0f4f5940b3195b39b9c7309a7a912fd03509a07929a7bb3b272315a8"}
Apr 16 18:19:09.557459 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:09.557405 2573 scope.go:117] "RemoveContainer" containerID="a22991f980087919dda993a80811e57aad3a0b405d300ce0be3e7fd5947c9eb6"
Apr 16 18:19:09.557790 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:09.557774 2573 scope.go:117] "RemoveContainer" containerID="660049fc0f4f5940b3195b39b9c7309a7a912fd03509a07929a7bb3b272315a8"
Apr 16 18:19:09.557975 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:19:09.557957 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-dn4qx_openshift-console-operator(a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-dn4qx" podUID="a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0"
Apr 16 18:19:10.561019 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:10.560994 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-dn4qx_a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0/console-operator/2.log"
Apr 16 18:19:12.064789 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:12.064758 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-l8qct"]
Apr 16 18:19:12.069428 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:12.069412 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-l8qct"
Apr 16 18:19:12.074394 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:12.074374 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 18:19:12.074394 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:12.074373 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-pnvp2\""
Apr 16 18:19:12.075093 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:12.075069 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 18:19:12.085611 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:12.085588 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-l8qct"]
Apr 16 18:19:12.172507 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:12.172472 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/cf14f56c-5787-44f0-936c-3bb1bd030d2b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-l8qct\" (UID: \"cf14f56c-5787-44f0-936c-3bb1bd030d2b\") " pod="openshift-insights/insights-runtime-extractor-l8qct"
Apr 16 18:19:12.172672 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:12.172515 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bllr6\" (UniqueName: \"kubernetes.io/projected/cf14f56c-5787-44f0-936c-3bb1bd030d2b-kube-api-access-bllr6\") pod \"insights-runtime-extractor-l8qct\" (UID: \"cf14f56c-5787-44f0-936c-3bb1bd030d2b\") " pod="openshift-insights/insights-runtime-extractor-l8qct"
Apr 16 18:19:12.172672 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:12.172622 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/cf14f56c-5787-44f0-936c-3bb1bd030d2b-data-volume\") pod \"insights-runtime-extractor-l8qct\" (UID: \"cf14f56c-5787-44f0-936c-3bb1bd030d2b\") " pod="openshift-insights/insights-runtime-extractor-l8qct"
Apr 16 18:19:12.172672 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:12.172655 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/cf14f56c-5787-44f0-936c-3bb1bd030d2b-crio-socket\") pod \"insights-runtime-extractor-l8qct\" (UID: \"cf14f56c-5787-44f0-936c-3bb1bd030d2b\") " pod="openshift-insights/insights-runtime-extractor-l8qct"
Apr 16 18:19:12.172799 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:12.172678 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/cf14f56c-5787-44f0-936c-3bb1bd030d2b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-l8qct\" (UID: \"cf14f56c-5787-44f0-936c-3bb1bd030d2b\") " pod="openshift-insights/insights-runtime-extractor-l8qct"
Apr 16 18:19:12.273497 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:12.273463 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/cf14f56c-5787-44f0-936c-3bb1bd030d2b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-l8qct\" (UID: \"cf14f56c-5787-44f0-936c-3bb1bd030d2b\") " pod="openshift-insights/insights-runtime-extractor-l8qct"
Apr 16 18:19:12.273714 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:12.273507 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bllr6\" (UniqueName: \"kubernetes.io/projected/cf14f56c-5787-44f0-936c-3bb1bd030d2b-kube-api-access-bllr6\") pod \"insights-runtime-extractor-l8qct\" (UID: \"cf14f56c-5787-44f0-936c-3bb1bd030d2b\") " pod="openshift-insights/insights-runtime-extractor-l8qct"
Apr 16 18:19:12.273714 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:12.273698 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/cf14f56c-5787-44f0-936c-3bb1bd030d2b-data-volume\") pod \"insights-runtime-extractor-l8qct\" (UID: \"cf14f56c-5787-44f0-936c-3bb1bd030d2b\") " pod="openshift-insights/insights-runtime-extractor-l8qct"
Apr 16 18:19:12.273848 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:12.273749 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/cf14f56c-5787-44f0-936c-3bb1bd030d2b-crio-socket\") pod \"insights-runtime-extractor-l8qct\" (UID: \"cf14f56c-5787-44f0-936c-3bb1bd030d2b\") " pod="openshift-insights/insights-runtime-extractor-l8qct"
Apr 16 18:19:12.273848 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:12.273784 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/cf14f56c-5787-44f0-936c-3bb1bd030d2b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-l8qct\" (UID: \"cf14f56c-5787-44f0-936c-3bb1bd030d2b\") " pod="openshift-insights/insights-runtime-extractor-l8qct"
Apr 16 18:19:12.273928 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:12.273878 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/cf14f56c-5787-44f0-936c-3bb1bd030d2b-crio-socket\") pod \"insights-runtime-extractor-l8qct\" (UID: \"cf14f56c-5787-44f0-936c-3bb1bd030d2b\") " pod="openshift-insights/insights-runtime-extractor-l8qct"
Apr 16 18:19:12.274082 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:12.274047 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/cf14f56c-5787-44f0-936c-3bb1bd030d2b-data-volume\") pod \"insights-runtime-extractor-l8qct\" (UID: \"cf14f56c-5787-44f0-936c-3bb1bd030d2b\") " pod="openshift-insights/insights-runtime-extractor-l8qct"
Apr 16 18:19:12.274290 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:12.274272 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/cf14f56c-5787-44f0-936c-3bb1bd030d2b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-l8qct\" (UID: \"cf14f56c-5787-44f0-936c-3bb1bd030d2b\") " pod="openshift-insights/insights-runtime-extractor-l8qct"
Apr 16 18:19:12.275930 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:12.275913 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/cf14f56c-5787-44f0-936c-3bb1bd030d2b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-l8qct\" (UID: \"cf14f56c-5787-44f0-936c-3bb1bd030d2b\") " pod="openshift-insights/insights-runtime-extractor-l8qct"
Apr 16 18:19:12.284632 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:12.284612 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bllr6\" (UniqueName: \"kubernetes.io/projected/cf14f56c-5787-44f0-936c-3bb1bd030d2b-kube-api-access-bllr6\") pod \"insights-runtime-extractor-l8qct\" (UID: \"cf14f56c-5787-44f0-936c-3bb1bd030d2b\") " pod="openshift-insights/insights-runtime-extractor-l8qct"
Apr 16 18:19:12.377521 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:12.377495 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-l8qct"
Apr 16 18:19:12.489847 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:12.489811 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-l8qct"]
Apr 16 18:19:12.494199 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:19:12.494172 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf14f56c_5787_44f0_936c_3bb1bd030d2b.slice/crio-27d0a48fc3ec677250eccb7637c60a94b350aedd3ef79c87ac99f88ec3268c25 WatchSource:0}: Error finding container 27d0a48fc3ec677250eccb7637c60a94b350aedd3ef79c87ac99f88ec3268c25: Status 404 returned error can't find the container with id 27d0a48fc3ec677250eccb7637c60a94b350aedd3ef79c87ac99f88ec3268c25
Apr 16 18:19:12.566610 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:12.566584 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-l8qct" event={"ID":"cf14f56c-5787-44f0-936c-3bb1bd030d2b","Type":"ContainerStarted","Data":"6b54ce03e1fa87f899d9d6666c8a692c3b650ad7107ad97c147c5804cc34c3a4"}
Apr 16 18:19:12.566717 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:12.566619 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-l8qct" event={"ID":"cf14f56c-5787-44f0-936c-3bb1bd030d2b","Type":"ContainerStarted","Data":"27d0a48fc3ec677250eccb7637c60a94b350aedd3ef79c87ac99f88ec3268c25"}
Apr 16 18:19:13.570491 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:13.570417 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-l8qct" event={"ID":"cf14f56c-5787-44f0-936c-3bb1bd030d2b","Type":"ContainerStarted","Data":"1049f38711ba2061f88abbaf909f6bbfd14aa5ca480a5b506a27480025c1a654"}
Apr 16 18:19:13.885181 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:19:13.885138 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-whfl9" podUID="267cfa25-31fb-4ef1-af56-1f468ac12dc6"
Apr 16 18:19:13.899734 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:19:13.899700 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-fqp4t" podUID="b8503a04-7aaa-49ef-bec9-fb099ecb0065"
Apr 16 18:19:13.958658 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:13.958627 2573 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-dn4qx"
Apr 16 18:19:13.958658 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:13.958662 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-dn4qx"
Apr 16 18:19:13.959040 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:13.959020 2573 scope.go:117] "RemoveContainer" containerID="660049fc0f4f5940b3195b39b9c7309a7a912fd03509a07929a7bb3b272315a8"
Apr 16 18:19:13.959221 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:19:13.959196 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-dn4qx_openshift-console-operator(a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-dn4qx" podUID="a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0"
Apr 16 18:19:14.135573 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:19:14.135477 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-sw2bl" podUID="a7930c14-4ef0-4949-a2ae-9a240da66c3c"
Apr 16 18:19:14.573375 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:14.573305 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-whfl9"
Apr 16 18:19:15.403914 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:15.403875 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/40926aa7-014a-4c73-95f1-c882be5b82a4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-4czjs\" (UID: \"40926aa7-014a-4c73-95f1-c882be5b82a4\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4czjs"
Apr 16 18:19:15.406346 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:15.406310 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/40926aa7-014a-4c73-95f1-c882be5b82a4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-4czjs\" (UID: \"40926aa7-014a-4c73-95f1-c882be5b82a4\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4czjs"
Apr 16 18:19:15.504706 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:15.504665 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38c20e31-955b-4eb0-8e64-330c1b15b52e-metrics-certs\") pod \"router-default-6564b967f4-t9vbs\" (UID: \"38c20e31-955b-4eb0-8e64-330c1b15b52e\") " pod="openshift-ingress/router-default-6564b967f4-t9vbs"
Apr 16 18:19:15.504846 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:15.504729 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38c20e31-955b-4eb0-8e64-330c1b15b52e-service-ca-bundle\") pod \"router-default-6564b967f4-t9vbs\" (UID: \"38c20e31-955b-4eb0-8e64-330c1b15b52e\") " pod="openshift-ingress/router-default-6564b967f4-t9vbs"
Apr 16 18:19:15.505232 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:15.505207 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38c20e31-955b-4eb0-8e64-330c1b15b52e-service-ca-bundle\") pod \"router-default-6564b967f4-t9vbs\" (UID: \"38c20e31-955b-4eb0-8e64-330c1b15b52e\") " pod="openshift-ingress/router-default-6564b967f4-t9vbs"
Apr 16 18:19:15.506819 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:15.506793 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38c20e31-955b-4eb0-8e64-330c1b15b52e-metrics-certs\") pod \"router-default-6564b967f4-t9vbs\" (UID: \"38c20e31-955b-4eb0-8e64-330c1b15b52e\") " pod="openshift-ingress/router-default-6564b967f4-t9vbs"
Apr 16 18:19:15.577222 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:15.577192 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-l8qct" event={"ID":"cf14f56c-5787-44f0-936c-3bb1bd030d2b","Type":"ContainerStarted","Data":"0b6687e5a6234378afc40aecbbf4b8508bd1de13d7f84f319bedbb1b4d1bf52f"}
Apr 16 18:19:15.596027 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:15.595982 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-l8qct" podStartSLOduration=1.5247523969999999 podStartE2EDuration="3.595967105s" podCreationTimestamp="2026-04-16 18:19:12 +0000 UTC" firstStartedPulling="2026-04-16 18:19:12.545910271 +0000 UTC m=+154.053175372" lastFinishedPulling="2026-04-16 18:19:14.617124973 +0000 UTC m=+156.124390080" observedRunningTime="2026-04-16 18:19:15.594165469 +0000 UTC m=+157.101430584" watchObservedRunningTime="2026-04-16 18:19:15.595967105 +0000 UTC m=+157.103232220"
Apr 16 18:19:15.650343 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:15.650311 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4czjs"
Apr 16 18:19:15.761504 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:15.761469 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-6564b967f4-t9vbs"
Apr 16 18:19:15.761677 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:15.761525 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-4czjs"]
Apr 16 18:19:15.764741 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:19:15.764716 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40926aa7_014a_4c73_95f1_c882be5b82a4.slice/crio-44938b8b112fcd17509900028e4a31e3df1954bcc003befed42de99bd9923622 WatchSource:0}: Error finding container 44938b8b112fcd17509900028e4a31e3df1954bcc003befed42de99bd9923622: Status 404 returned error can't find the container with id 44938b8b112fcd17509900028e4a31e3df1954bcc003befed42de99bd9923622
Apr 16 18:19:15.878313 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:15.878282 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-6564b967f4-t9vbs"]
Apr 16 18:19:15.881339 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:19:15.881310 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38c20e31_955b_4eb0_8e64_330c1b15b52e.slice/crio-a2c42994b3262f45be2d58cab07a4091ddfb015889819fbdcc4b908ec6e3fac2 WatchSource:0}: Error finding container a2c42994b3262f45be2d58cab07a4091ddfb015889819fbdcc4b908ec6e3fac2: Status 404 returned error can't find the container with id a2c42994b3262f45be2d58cab07a4091ddfb015889819fbdcc4b908ec6e3fac2
Apr 16 18:19:16.581347 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:16.581306 2573 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4czjs" event={"ID":"40926aa7-014a-4c73-95f1-c882be5b82a4","Type":"ContainerStarted","Data":"44938b8b112fcd17509900028e4a31e3df1954bcc003befed42de99bd9923622"} Apr 16 18:19:16.582606 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:16.582569 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-6564b967f4-t9vbs" event={"ID":"38c20e31-955b-4eb0-8e64-330c1b15b52e","Type":"ContainerStarted","Data":"47f782c9333b28e0dfe6ddff128fd8752f67af5530d0c8334c624ca981f6d970"} Apr 16 18:19:16.582718 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:16.582609 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-6564b967f4-t9vbs" event={"ID":"38c20e31-955b-4eb0-8e64-330c1b15b52e","Type":"ContainerStarted","Data":"a2c42994b3262f45be2d58cab07a4091ddfb015889819fbdcc4b908ec6e3fac2"} Apr 16 18:19:16.602202 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:16.602155 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-6564b967f4-t9vbs" podStartSLOduration=33.602144284 podStartE2EDuration="33.602144284s" podCreationTimestamp="2026-04-16 18:18:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:19:16.601380394 +0000 UTC m=+158.108645515" watchObservedRunningTime="2026-04-16 18:19:16.602144284 +0000 UTC m=+158.109409400" Apr 16 18:19:16.761722 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:16.761693 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-6564b967f4-t9vbs" Apr 16 18:19:16.764551 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:16.764526 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-6564b967f4-t9vbs" Apr 16 
18:19:17.586734 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:17.586693 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4czjs" event={"ID":"40926aa7-014a-4c73-95f1-c882be5b82a4","Type":"ContainerStarted","Data":"95b8b90e41bef07f3b0114e4285581a6f009eb4f54aca02c63281f44237b2cd0"} Apr 16 18:19:17.587203 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:17.586948 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-6564b967f4-t9vbs" Apr 16 18:19:17.588237 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:17.588216 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-6564b967f4-t9vbs" Apr 16 18:19:17.604513 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:17.604464 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4czjs" podStartSLOduration=32.885052637 podStartE2EDuration="34.604450132s" podCreationTimestamp="2026-04-16 18:18:43 +0000 UTC" firstStartedPulling="2026-04-16 18:19:15.766467447 +0000 UTC m=+157.273732544" lastFinishedPulling="2026-04-16 18:19:17.485864931 +0000 UTC m=+158.993130039" observedRunningTime="2026-04-16 18:19:17.60413963 +0000 UTC m=+159.111404759" watchObservedRunningTime="2026-04-16 18:19:17.604450132 +0000 UTC m=+159.111715249" Apr 16 18:19:17.981132 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:17.981097 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-wn88l"] Apr 16 18:19:17.984411 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:17.984393 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-wn88l" Apr 16 18:19:17.986838 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:17.986815 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 16 18:19:17.987397 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:17.987382 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-g7xx4\"" Apr 16 18:19:17.990466 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:17.990436 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-wn88l"] Apr 16 18:19:18.019340 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:18.019283 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c547cf5f-3757-43c6-aec3-a7da5a6b053e-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-wn88l\" (UID: \"c547cf5f-3757-43c6-aec3-a7da5a6b053e\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-wn88l" Apr 16 18:19:18.119841 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:18.119815 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c547cf5f-3757-43c6-aec3-a7da5a6b053e-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-wn88l\" (UID: \"c547cf5f-3757-43c6-aec3-a7da5a6b053e\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-wn88l" Apr 16 18:19:18.119971 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:19:18.119956 2573 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" 
not found Apr 16 18:19:18.120028 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:19:18.120019 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c547cf5f-3757-43c6-aec3-a7da5a6b053e-tls-certificates podName:c547cf5f-3757-43c6-aec3-a7da5a6b053e nodeName:}" failed. No retries permitted until 2026-04-16 18:19:18.62000263 +0000 UTC m=+160.127267728 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/c547cf5f-3757-43c6-aec3-a7da5a6b053e-tls-certificates") pod "prometheus-operator-admission-webhook-9cb97cd87-wn88l" (UID: "c547cf5f-3757-43c6-aec3-a7da5a6b053e") : secret "prometheus-operator-admission-webhook-tls" not found Apr 16 18:19:18.623272 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:18.623237 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c547cf5f-3757-43c6-aec3-a7da5a6b053e-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-wn88l\" (UID: \"c547cf5f-3757-43c6-aec3-a7da5a6b053e\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-wn88l" Apr 16 18:19:18.625568 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:18.625522 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c547cf5f-3757-43c6-aec3-a7da5a6b053e-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-wn88l\" (UID: \"c547cf5f-3757-43c6-aec3-a7da5a6b053e\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-wn88l" Apr 16 18:19:18.825218 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:18.825183 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8503a04-7aaa-49ef-bec9-fb099ecb0065-cert\") pod \"ingress-canary-fqp4t\" (UID: \"b8503a04-7aaa-49ef-bec9-fb099ecb0065\") " 
pod="openshift-ingress-canary/ingress-canary-fqp4t" Apr 16 18:19:18.825218 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:18.825222 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/267cfa25-31fb-4ef1-af56-1f468ac12dc6-metrics-tls\") pod \"dns-default-whfl9\" (UID: \"267cfa25-31fb-4ef1-af56-1f468ac12dc6\") " pod="openshift-dns/dns-default-whfl9" Apr 16 18:19:18.827392 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:18.827368 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/267cfa25-31fb-4ef1-af56-1f468ac12dc6-metrics-tls\") pod \"dns-default-whfl9\" (UID: \"267cfa25-31fb-4ef1-af56-1f468ac12dc6\") " pod="openshift-dns/dns-default-whfl9" Apr 16 18:19:18.827606 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:18.827586 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8503a04-7aaa-49ef-bec9-fb099ecb0065-cert\") pod \"ingress-canary-fqp4t\" (UID: \"b8503a04-7aaa-49ef-bec9-fb099ecb0065\") " pod="openshift-ingress-canary/ingress-canary-fqp4t" Apr 16 18:19:18.893784 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:18.893719 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-wn88l" Apr 16 18:19:19.004512 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:19.004488 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-wn88l"] Apr 16 18:19:19.006750 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:19:19.006723 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc547cf5f_3757_43c6_aec3_a7da5a6b053e.slice/crio-8426b2371ed20f525644524f31e963875f4d46bf856211f3a7a8bffe6cbb57b8 WatchSource:0}: Error finding container 8426b2371ed20f525644524f31e963875f4d46bf856211f3a7a8bffe6cbb57b8: Status 404 returned error can't find the container with id 8426b2371ed20f525644524f31e963875f4d46bf856211f3a7a8bffe6cbb57b8 Apr 16 18:19:19.076192 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:19.076163 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-gz4g7\"" Apr 16 18:19:19.084202 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:19.084182 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-whfl9" Apr 16 18:19:19.196908 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:19.196873 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-whfl9"] Apr 16 18:19:19.199434 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:19:19.199405 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod267cfa25_31fb_4ef1_af56_1f468ac12dc6.slice/crio-afca3ac75e63bd63396badfda83c5e27470948c128a0fff33a3a479d24314dd6 WatchSource:0}: Error finding container afca3ac75e63bd63396badfda83c5e27470948c128a0fff33a3a479d24314dd6: Status 404 returned error can't find the container with id afca3ac75e63bd63396badfda83c5e27470948c128a0fff33a3a479d24314dd6 Apr 16 18:19:19.593509 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:19.593387 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-whfl9" event={"ID":"267cfa25-31fb-4ef1-af56-1f468ac12dc6","Type":"ContainerStarted","Data":"afca3ac75e63bd63396badfda83c5e27470948c128a0fff33a3a479d24314dd6"} Apr 16 18:19:19.594469 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:19.594434 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-wn88l" event={"ID":"c547cf5f-3757-43c6-aec3-a7da5a6b053e","Type":"ContainerStarted","Data":"8426b2371ed20f525644524f31e963875f4d46bf856211f3a7a8bffe6cbb57b8"} Apr 16 18:19:20.598177 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:20.598141 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-wn88l" event={"ID":"c547cf5f-3757-43c6-aec3-a7da5a6b053e","Type":"ContainerStarted","Data":"d33277e155eb70c57cc34b7b226be4b795560e843913463632222d9d1d63a5f1"} Apr 16 18:19:20.598534 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:20.598347 2573 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-wn88l" Apr 16 18:19:20.603262 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:20.603243 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-wn88l" Apr 16 18:19:20.614765 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:20.614728 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-wn88l" podStartSLOduration=2.621634704 podStartE2EDuration="3.614716666s" podCreationTimestamp="2026-04-16 18:19:17 +0000 UTC" firstStartedPulling="2026-04-16 18:19:19.008488873 +0000 UTC m=+160.515753967" lastFinishedPulling="2026-04-16 18:19:20.001570827 +0000 UTC m=+161.508835929" observedRunningTime="2026-04-16 18:19:20.614139601 +0000 UTC m=+162.121404714" watchObservedRunningTime="2026-04-16 18:19:20.614716666 +0000 UTC m=+162.121981782" Apr 16 18:19:21.046259 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:21.046218 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-d7mmq"] Apr 16 18:19:21.049162 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:21.049145 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-d7mmq" Apr 16 18:19:21.051796 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:21.051775 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 18:19:21.052666 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:21.052644 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 16 18:19:21.052791 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:21.052642 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 16 18:19:21.052791 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:21.052645 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-dlcl4\"" Apr 16 18:19:21.060460 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:21.060438 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-d7mmq"] Apr 16 18:19:21.139950 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:21.139882 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/bc144f01-96a2-4b5e-bd4c-164a324f11de-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-d7mmq\" (UID: \"bc144f01-96a2-4b5e-bd4c-164a324f11de\") " pod="openshift-monitoring/prometheus-operator-78f957474d-d7mmq" Apr 16 18:19:21.139950 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:21.139915 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk9g9\" (UniqueName: \"kubernetes.io/projected/bc144f01-96a2-4b5e-bd4c-164a324f11de-kube-api-access-pk9g9\") pod \"prometheus-operator-78f957474d-d7mmq\" 
(UID: \"bc144f01-96a2-4b5e-bd4c-164a324f11de\") " pod="openshift-monitoring/prometheus-operator-78f957474d-d7mmq" Apr 16 18:19:21.139950 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:21.139939 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bc144f01-96a2-4b5e-bd4c-164a324f11de-metrics-client-ca\") pod \"prometheus-operator-78f957474d-d7mmq\" (UID: \"bc144f01-96a2-4b5e-bd4c-164a324f11de\") " pod="openshift-monitoring/prometheus-operator-78f957474d-d7mmq" Apr 16 18:19:21.140180 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:21.140029 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bc144f01-96a2-4b5e-bd4c-164a324f11de-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-d7mmq\" (UID: \"bc144f01-96a2-4b5e-bd4c-164a324f11de\") " pod="openshift-monitoring/prometheus-operator-78f957474d-d7mmq" Apr 16 18:19:21.241038 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:21.241005 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/bc144f01-96a2-4b5e-bd4c-164a324f11de-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-d7mmq\" (UID: \"bc144f01-96a2-4b5e-bd4c-164a324f11de\") " pod="openshift-monitoring/prometheus-operator-78f957474d-d7mmq" Apr 16 18:19:21.241038 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:21.241038 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pk9g9\" (UniqueName: \"kubernetes.io/projected/bc144f01-96a2-4b5e-bd4c-164a324f11de-kube-api-access-pk9g9\") pod \"prometheus-operator-78f957474d-d7mmq\" (UID: \"bc144f01-96a2-4b5e-bd4c-164a324f11de\") " pod="openshift-monitoring/prometheus-operator-78f957474d-d7mmq" Apr 16 
18:19:21.241236 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:21.241059 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bc144f01-96a2-4b5e-bd4c-164a324f11de-metrics-client-ca\") pod \"prometheus-operator-78f957474d-d7mmq\" (UID: \"bc144f01-96a2-4b5e-bd4c-164a324f11de\") " pod="openshift-monitoring/prometheus-operator-78f957474d-d7mmq" Apr 16 18:19:21.241236 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:21.241101 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bc144f01-96a2-4b5e-bd4c-164a324f11de-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-d7mmq\" (UID: \"bc144f01-96a2-4b5e-bd4c-164a324f11de\") " pod="openshift-monitoring/prometheus-operator-78f957474d-d7mmq" Apr 16 18:19:21.241804 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:21.241772 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bc144f01-96a2-4b5e-bd4c-164a324f11de-metrics-client-ca\") pod \"prometheus-operator-78f957474d-d7mmq\" (UID: \"bc144f01-96a2-4b5e-bd4c-164a324f11de\") " pod="openshift-monitoring/prometheus-operator-78f957474d-d7mmq" Apr 16 18:19:21.243425 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:21.243396 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/bc144f01-96a2-4b5e-bd4c-164a324f11de-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-d7mmq\" (UID: \"bc144f01-96a2-4b5e-bd4c-164a324f11de\") " pod="openshift-monitoring/prometheus-operator-78f957474d-d7mmq" Apr 16 18:19:21.243425 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:21.243421 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" 
(UniqueName: \"kubernetes.io/secret/bc144f01-96a2-4b5e-bd4c-164a324f11de-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-d7mmq\" (UID: \"bc144f01-96a2-4b5e-bd4c-164a324f11de\") " pod="openshift-monitoring/prometheus-operator-78f957474d-d7mmq" Apr 16 18:19:21.248574 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:21.248539 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk9g9\" (UniqueName: \"kubernetes.io/projected/bc144f01-96a2-4b5e-bd4c-164a324f11de-kube-api-access-pk9g9\") pod \"prometheus-operator-78f957474d-d7mmq\" (UID: \"bc144f01-96a2-4b5e-bd4c-164a324f11de\") " pod="openshift-monitoring/prometheus-operator-78f957474d-d7mmq" Apr 16 18:19:21.357465 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:21.357395 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-d7mmq" Apr 16 18:19:21.467110 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:21.467077 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-d7mmq"] Apr 16 18:19:21.469699 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:19:21.469666 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc144f01_96a2_4b5e_bd4c_164a324f11de.slice/crio-f1fc60166dbd83615a0958d6ad97ee4fd2811f6e15ce049aa22df8fb64a88fcb WatchSource:0}: Error finding container f1fc60166dbd83615a0958d6ad97ee4fd2811f6e15ce049aa22df8fb64a88fcb: Status 404 returned error can't find the container with id f1fc60166dbd83615a0958d6ad97ee4fd2811f6e15ce049aa22df8fb64a88fcb Apr 16 18:19:21.602412 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:21.602380 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-d7mmq" 
event={"ID":"bc144f01-96a2-4b5e-bd4c-164a324f11de","Type":"ContainerStarted","Data":"f1fc60166dbd83615a0958d6ad97ee4fd2811f6e15ce049aa22df8fb64a88fcb"} Apr 16 18:19:21.603888 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:21.603864 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-whfl9" event={"ID":"267cfa25-31fb-4ef1-af56-1f468ac12dc6","Type":"ContainerStarted","Data":"ba1a06c5c2e8e845ef1d0100d16f0bc411b095131c1a05dfbacf67f27ba59a08"} Apr 16 18:19:21.603888 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:21.603892 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-whfl9" event={"ID":"267cfa25-31fb-4ef1-af56-1f468ac12dc6","Type":"ContainerStarted","Data":"bbe4a2b52fd2a31a0dad510a484ec190ece5a4ebc292b5e60cbf1033df319e36"} Apr 16 18:19:21.620619 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:21.620579 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-whfl9" podStartSLOduration=130.307569794 podStartE2EDuration="2m11.620566891s" podCreationTimestamp="2026-04-16 18:17:10 +0000 UTC" firstStartedPulling="2026-04-16 18:19:19.201445445 +0000 UTC m=+160.708710542" lastFinishedPulling="2026-04-16 18:19:20.514442545 +0000 UTC m=+162.021707639" observedRunningTime="2026-04-16 18:19:21.619729702 +0000 UTC m=+163.126994832" watchObservedRunningTime="2026-04-16 18:19:21.620566891 +0000 UTC m=+163.127831998" Apr 16 18:19:22.607060 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:22.607030 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-whfl9" Apr 16 18:19:23.613959 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:23.613924 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-d7mmq" event={"ID":"bc144f01-96a2-4b5e-bd4c-164a324f11de","Type":"ContainerStarted","Data":"b97d25cf140d3d10a10420d36e4204d495d52f75b8d05a58abbc957c37da186a"} Apr 
16 18:19:23.613959 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:23.613963 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-d7mmq" event={"ID":"bc144f01-96a2-4b5e-bd4c-164a324f11de","Type":"ContainerStarted","Data":"3166d5e4643c84f0345fece49d78608ce03dad8646f116fd81c0a343c2f3e0c8"} Apr 16 18:19:23.631020 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:23.630977 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-78f957474d-d7mmq" podStartSLOduration=1.4361773420000001 podStartE2EDuration="2.630964044s" podCreationTimestamp="2026-04-16 18:19:21 +0000 UTC" firstStartedPulling="2026-04-16 18:19:21.471494033 +0000 UTC m=+162.978759127" lastFinishedPulling="2026-04-16 18:19:22.666280734 +0000 UTC m=+164.173545829" observedRunningTime="2026-04-16 18:19:23.629494443 +0000 UTC m=+165.136759558" watchObservedRunningTime="2026-04-16 18:19:23.630964044 +0000 UTC m=+165.138229166" Apr 16 18:19:25.401268 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.401225 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-5mxgb"] Apr 16 18:19:25.404877 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.404856 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-5mxgb" Apr 16 18:19:25.407477 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.407449 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 16 18:19:25.407615 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.407520 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-626pp\"" Apr 16 18:19:25.407680 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.407530 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 16 18:19:25.415484 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.415461 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-5mxgb"] Apr 16 18:19:25.432757 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.432732 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-8bbb7"] Apr 16 18:19:25.435889 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.435869 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-8bbb7" Apr 16 18:19:25.438589 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.438569 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 18:19:25.438701 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.438612 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-gmsbp\"" Apr 16 18:19:25.438701 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.438638 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 18:19:25.438970 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.438953 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 18:19:25.473822 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.473791 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/85fa889e-121e-49ab-b5e9-49f2f731ad8b-node-exporter-tls\") pod \"node-exporter-8bbb7\" (UID: \"85fa889e-121e-49ab-b5e9-49f2f731ad8b\") " pod="openshift-monitoring/node-exporter-8bbb7" Apr 16 18:19:25.473952 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.473837 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/85fa889e-121e-49ab-b5e9-49f2f731ad8b-node-exporter-textfile\") pod \"node-exporter-8bbb7\" (UID: \"85fa889e-121e-49ab-b5e9-49f2f731ad8b\") " pod="openshift-monitoring/node-exporter-8bbb7" Apr 16 18:19:25.473952 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.473889 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/85fa889e-121e-49ab-b5e9-49f2f731ad8b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8bbb7\" (UID: \"85fa889e-121e-49ab-b5e9-49f2f731ad8b\") " pod="openshift-monitoring/node-exporter-8bbb7" Apr 16 18:19:25.473952 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.473924 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/85fa889e-121e-49ab-b5e9-49f2f731ad8b-metrics-client-ca\") pod \"node-exporter-8bbb7\" (UID: \"85fa889e-121e-49ab-b5e9-49f2f731ad8b\") " pod="openshift-monitoring/node-exporter-8bbb7" Apr 16 18:19:25.473952 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.473949 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/85fa889e-121e-49ab-b5e9-49f2f731ad8b-node-exporter-accelerators-collector-config\") pod \"node-exporter-8bbb7\" (UID: \"85fa889e-121e-49ab-b5e9-49f2f731ad8b\") " pod="openshift-monitoring/node-exporter-8bbb7" Apr 16 18:19:25.474104 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.473966 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/bd4a3eb9-d8d8-4957-b1e4-87fc2e796d99-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-5mxgb\" (UID: \"bd4a3eb9-d8d8-4957-b1e4-87fc2e796d99\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-5mxgb" Apr 16 18:19:25.474104 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.473984 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bd4a3eb9-d8d8-4957-b1e4-87fc2e796d99-metrics-client-ca\") pod 
\"openshift-state-metrics-5669946b84-5mxgb\" (UID: \"bd4a3eb9-d8d8-4957-b1e4-87fc2e796d99\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-5mxgb" Apr 16 18:19:25.474104 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.474016 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/85fa889e-121e-49ab-b5e9-49f2f731ad8b-node-exporter-wtmp\") pod \"node-exporter-8bbb7\" (UID: \"85fa889e-121e-49ab-b5e9-49f2f731ad8b\") " pod="openshift-monitoring/node-exporter-8bbb7" Apr 16 18:19:25.474104 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.474039 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fngn5\" (UniqueName: \"kubernetes.io/projected/85fa889e-121e-49ab-b5e9-49f2f731ad8b-kube-api-access-fngn5\") pod \"node-exporter-8bbb7\" (UID: \"85fa889e-121e-49ab-b5e9-49f2f731ad8b\") " pod="openshift-monitoring/node-exporter-8bbb7" Apr 16 18:19:25.474104 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.474092 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bd4a3eb9-d8d8-4957-b1e4-87fc2e796d99-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-5mxgb\" (UID: \"bd4a3eb9-d8d8-4957-b1e4-87fc2e796d99\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-5mxgb" Apr 16 18:19:25.474262 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.474124 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/85fa889e-121e-49ab-b5e9-49f2f731ad8b-sys\") pod \"node-exporter-8bbb7\" (UID: \"85fa889e-121e-49ab-b5e9-49f2f731ad8b\") " pod="openshift-monitoring/node-exporter-8bbb7" Apr 16 18:19:25.474262 ip-10-0-131-203 
kubenswrapper[2573]: I0416 18:19:25.474147 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/85fa889e-121e-49ab-b5e9-49f2f731ad8b-root\") pod \"node-exporter-8bbb7\" (UID: \"85fa889e-121e-49ab-b5e9-49f2f731ad8b\") " pod="openshift-monitoring/node-exporter-8bbb7" Apr 16 18:19:25.474262 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.474185 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnwql\" (UniqueName: \"kubernetes.io/projected/bd4a3eb9-d8d8-4957-b1e4-87fc2e796d99-kube-api-access-tnwql\") pod \"openshift-state-metrics-5669946b84-5mxgb\" (UID: \"bd4a3eb9-d8d8-4957-b1e4-87fc2e796d99\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-5mxgb" Apr 16 18:19:25.574517 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.574488 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/85fa889e-121e-49ab-b5e9-49f2f731ad8b-node-exporter-tls\") pod \"node-exporter-8bbb7\" (UID: \"85fa889e-121e-49ab-b5e9-49f2f731ad8b\") " pod="openshift-monitoring/node-exporter-8bbb7" Apr 16 18:19:25.574517 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.574526 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/85fa889e-121e-49ab-b5e9-49f2f731ad8b-node-exporter-textfile\") pod \"node-exporter-8bbb7\" (UID: \"85fa889e-121e-49ab-b5e9-49f2f731ad8b\") " pod="openshift-monitoring/node-exporter-8bbb7" Apr 16 18:19:25.574769 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.574573 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/85fa889e-121e-49ab-b5e9-49f2f731ad8b-node-exporter-kube-rbac-proxy-config\") pod 
\"node-exporter-8bbb7\" (UID: \"85fa889e-121e-49ab-b5e9-49f2f731ad8b\") " pod="openshift-monitoring/node-exporter-8bbb7" Apr 16 18:19:25.574769 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.574678 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/85fa889e-121e-49ab-b5e9-49f2f731ad8b-metrics-client-ca\") pod \"node-exporter-8bbb7\" (UID: \"85fa889e-121e-49ab-b5e9-49f2f731ad8b\") " pod="openshift-monitoring/node-exporter-8bbb7" Apr 16 18:19:25.574769 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.574735 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/85fa889e-121e-49ab-b5e9-49f2f731ad8b-node-exporter-accelerators-collector-config\") pod \"node-exporter-8bbb7\" (UID: \"85fa889e-121e-49ab-b5e9-49f2f731ad8b\") " pod="openshift-monitoring/node-exporter-8bbb7" Apr 16 18:19:25.574769 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.574765 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/bd4a3eb9-d8d8-4957-b1e4-87fc2e796d99-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-5mxgb\" (UID: \"bd4a3eb9-d8d8-4957-b1e4-87fc2e796d99\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-5mxgb" Apr 16 18:19:25.574962 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.574794 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bd4a3eb9-d8d8-4957-b1e4-87fc2e796d99-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-5mxgb\" (UID: \"bd4a3eb9-d8d8-4957-b1e4-87fc2e796d99\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-5mxgb" Apr 16 18:19:25.574962 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.574844 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/85fa889e-121e-49ab-b5e9-49f2f731ad8b-node-exporter-wtmp\") pod \"node-exporter-8bbb7\" (UID: \"85fa889e-121e-49ab-b5e9-49f2f731ad8b\") " pod="openshift-monitoring/node-exporter-8bbb7" Apr 16 18:19:25.574962 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.574869 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fngn5\" (UniqueName: \"kubernetes.io/projected/85fa889e-121e-49ab-b5e9-49f2f731ad8b-kube-api-access-fngn5\") pod \"node-exporter-8bbb7\" (UID: \"85fa889e-121e-49ab-b5e9-49f2f731ad8b\") " pod="openshift-monitoring/node-exporter-8bbb7" Apr 16 18:19:25.574962 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.574915 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bd4a3eb9-d8d8-4957-b1e4-87fc2e796d99-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-5mxgb\" (UID: \"bd4a3eb9-d8d8-4957-b1e4-87fc2e796d99\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-5mxgb" Apr 16 18:19:25.574962 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.574930 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/85fa889e-121e-49ab-b5e9-49f2f731ad8b-node-exporter-textfile\") pod \"node-exporter-8bbb7\" (UID: \"85fa889e-121e-49ab-b5e9-49f2f731ad8b\") " pod="openshift-monitoring/node-exporter-8bbb7" Apr 16 18:19:25.575178 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.574962 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/85fa889e-121e-49ab-b5e9-49f2f731ad8b-sys\") pod \"node-exporter-8bbb7\" (UID: \"85fa889e-121e-49ab-b5e9-49f2f731ad8b\") " 
pod="openshift-monitoring/node-exporter-8bbb7" Apr 16 18:19:25.575178 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.574999 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/85fa889e-121e-49ab-b5e9-49f2f731ad8b-root\") pod \"node-exporter-8bbb7\" (UID: \"85fa889e-121e-49ab-b5e9-49f2f731ad8b\") " pod="openshift-monitoring/node-exporter-8bbb7" Apr 16 18:19:25.575178 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.575024 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tnwql\" (UniqueName: \"kubernetes.io/projected/bd4a3eb9-d8d8-4957-b1e4-87fc2e796d99-kube-api-access-tnwql\") pod \"openshift-state-metrics-5669946b84-5mxgb\" (UID: \"bd4a3eb9-d8d8-4957-b1e4-87fc2e796d99\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-5mxgb" Apr 16 18:19:25.575386 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.575366 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/85fa889e-121e-49ab-b5e9-49f2f731ad8b-node-exporter-wtmp\") pod \"node-exporter-8bbb7\" (UID: \"85fa889e-121e-49ab-b5e9-49f2f731ad8b\") " pod="openshift-monitoring/node-exporter-8bbb7" Apr 16 18:19:25.575449 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.575428 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/85fa889e-121e-49ab-b5e9-49f2f731ad8b-sys\") pod \"node-exporter-8bbb7\" (UID: \"85fa889e-121e-49ab-b5e9-49f2f731ad8b\") " pod="openshift-monitoring/node-exporter-8bbb7" Apr 16 18:19:25.575503 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.575474 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/85fa889e-121e-49ab-b5e9-49f2f731ad8b-root\") pod \"node-exporter-8bbb7\" (UID: \"85fa889e-121e-49ab-b5e9-49f2f731ad8b\") " 
pod="openshift-monitoring/node-exporter-8bbb7" Apr 16 18:19:25.575659 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.575603 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bd4a3eb9-d8d8-4957-b1e4-87fc2e796d99-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-5mxgb\" (UID: \"bd4a3eb9-d8d8-4957-b1e4-87fc2e796d99\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-5mxgb" Apr 16 18:19:25.575993 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.575962 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/85fa889e-121e-49ab-b5e9-49f2f731ad8b-metrics-client-ca\") pod \"node-exporter-8bbb7\" (UID: \"85fa889e-121e-49ab-b5e9-49f2f731ad8b\") " pod="openshift-monitoring/node-exporter-8bbb7" Apr 16 18:19:25.576078 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.576003 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/85fa889e-121e-49ab-b5e9-49f2f731ad8b-node-exporter-accelerators-collector-config\") pod \"node-exporter-8bbb7\" (UID: \"85fa889e-121e-49ab-b5e9-49f2f731ad8b\") " pod="openshift-monitoring/node-exporter-8bbb7" Apr 16 18:19:25.577479 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.577448 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/85fa889e-121e-49ab-b5e9-49f2f731ad8b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8bbb7\" (UID: \"85fa889e-121e-49ab-b5e9-49f2f731ad8b\") " pod="openshift-monitoring/node-exporter-8bbb7" Apr 16 18:19:25.577584 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.577459 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/85fa889e-121e-49ab-b5e9-49f2f731ad8b-node-exporter-tls\") pod \"node-exporter-8bbb7\" (UID: \"85fa889e-121e-49ab-b5e9-49f2f731ad8b\") " pod="openshift-monitoring/node-exporter-8bbb7" Apr 16 18:19:25.577584 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.577503 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/bd4a3eb9-d8d8-4957-b1e4-87fc2e796d99-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-5mxgb\" (UID: \"bd4a3eb9-d8d8-4957-b1e4-87fc2e796d99\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-5mxgb" Apr 16 18:19:25.577710 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.577689 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bd4a3eb9-d8d8-4957-b1e4-87fc2e796d99-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-5mxgb\" (UID: \"bd4a3eb9-d8d8-4957-b1e4-87fc2e796d99\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-5mxgb" Apr 16 18:19:25.583098 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.583074 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fngn5\" (UniqueName: \"kubernetes.io/projected/85fa889e-121e-49ab-b5e9-49f2f731ad8b-kube-api-access-fngn5\") pod \"node-exporter-8bbb7\" (UID: \"85fa889e-121e-49ab-b5e9-49f2f731ad8b\") " pod="openshift-monitoring/node-exporter-8bbb7" Apr 16 18:19:25.583194 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.583174 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnwql\" (UniqueName: \"kubernetes.io/projected/bd4a3eb9-d8d8-4957-b1e4-87fc2e796d99-kube-api-access-tnwql\") pod \"openshift-state-metrics-5669946b84-5mxgb\" (UID: \"bd4a3eb9-d8d8-4957-b1e4-87fc2e796d99\") " 
pod="openshift-monitoring/openshift-state-metrics-5669946b84-5mxgb" Apr 16 18:19:25.714987 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.714920 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-5mxgb" Apr 16 18:19:25.746197 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.746167 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-8bbb7" Apr 16 18:19:25.755329 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:19:25.755303 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85fa889e_121e_49ab_b5e9_49f2f731ad8b.slice/crio-386cb6bfed0e14242da424f754f0bb790db37d1b1af2515fe222230b9be4745d WatchSource:0}: Error finding container 386cb6bfed0e14242da424f754f0bb790db37d1b1af2515fe222230b9be4745d: Status 404 returned error can't find the container with id 386cb6bfed0e14242da424f754f0bb790db37d1b1af2515fe222230b9be4745d Apr 16 18:19:25.832893 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:25.832864 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-5mxgb"] Apr 16 18:19:25.836380 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:19:25.836355 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd4a3eb9_d8d8_4957_b1e4_87fc2e796d99.slice/crio-2852cacc7a68d116ead14dc691361b6d73e0d99a22e612d559441d38cb9327e9 WatchSource:0}: Error finding container 2852cacc7a68d116ead14dc691361b6d73e0d99a22e612d559441d38cb9327e9: Status 404 returned error can't find the container with id 2852cacc7a68d116ead14dc691361b6d73e0d99a22e612d559441d38cb9327e9 Apr 16 18:19:26.624664 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:26.624518 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/openshift-state-metrics-5669946b84-5mxgb" event={"ID":"bd4a3eb9-d8d8-4957-b1e4-87fc2e796d99","Type":"ContainerStarted","Data":"7b1a97750c01d9fd7ed66922b3543143a5b565d61c3bed5e53c3db2e5b7a64d6"} Apr 16 18:19:26.624664 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:26.624623 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-5mxgb" event={"ID":"bd4a3eb9-d8d8-4957-b1e4-87fc2e796d99","Type":"ContainerStarted","Data":"9e8675638205aceaaab3e2847de58701dcb2a6e693ef8f2a4d09aa6507cfbcbe"} Apr 16 18:19:26.624664 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:26.624642 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-5mxgb" event={"ID":"bd4a3eb9-d8d8-4957-b1e4-87fc2e796d99","Type":"ContainerStarted","Data":"2852cacc7a68d116ead14dc691361b6d73e0d99a22e612d559441d38cb9327e9"} Apr 16 18:19:26.627039 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:26.626249 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8bbb7" event={"ID":"85fa889e-121e-49ab-b5e9-49f2f731ad8b","Type":"ContainerStarted","Data":"dbf5aef7b96f400a654a0491591aa5f467d9ccc42463c18e21c51193576edd47"} Apr 16 18:19:26.627039 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:26.626287 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8bbb7" event={"ID":"85fa889e-121e-49ab-b5e9-49f2f731ad8b","Type":"ContainerStarted","Data":"386cb6bfed0e14242da424f754f0bb790db37d1b1af2515fe222230b9be4745d"} Apr 16 18:19:27.116920 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.116885 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fqp4t" Apr 16 18:19:27.117531 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.117503 2573 scope.go:117] "RemoveContainer" containerID="660049fc0f4f5940b3195b39b9c7309a7a912fd03509a07929a7bb3b272315a8" Apr 16 18:19:27.117802 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:19:27.117777 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-dn4qx_openshift-console-operator(a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-dn4qx" podUID="a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0" Apr 16 18:19:27.119391 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.119365 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-4b92n\"" Apr 16 18:19:27.127499 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.127480 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fqp4t" Apr 16 18:19:27.249192 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.249167 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fqp4t"] Apr 16 18:19:27.251770 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:19:27.251746 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8503a04_7aaa_49ef_bec9_fb099ecb0065.slice/crio-7d6f8adc066b71be2eec1a708cf17e6259caa959ab4b6d63ee5820662d7f03db WatchSource:0}: Error finding container 7d6f8adc066b71be2eec1a708cf17e6259caa959ab4b6d63ee5820662d7f03db: Status 404 returned error can't find the container with id 7d6f8adc066b71be2eec1a708cf17e6259caa959ab4b6d63ee5820662d7f03db Apr 16 18:19:27.422163 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.422135 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-56d858f7df-dh959"] Apr 16 18:19:27.428624 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.428599 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-56d858f7df-dh959" Apr 16 18:19:27.431252 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.431225 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 16 18:19:27.431252 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.431230 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 16 18:19:27.431468 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.431313 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 16 18:19:27.431468 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.431314 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-ahm555ntl5sha\"" Apr 16 18:19:27.431726 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.431662 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 16 18:19:27.431726 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.431716 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-z5rkg\"" Apr 16 18:19:27.432324 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.432307 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 16 18:19:27.438378 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.438359 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-56d858f7df-dh959"] Apr 16 18:19:27.494193 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.494171 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/87a9b265-79da-4fbd-8c71-83ff25214c57-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-56d858f7df-dh959\" (UID: \"87a9b265-79da-4fbd-8c71-83ff25214c57\") " pod="openshift-monitoring/thanos-querier-56d858f7df-dh959" Apr 16 18:19:27.494193 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.494202 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/87a9b265-79da-4fbd-8c71-83ff25214c57-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-56d858f7df-dh959\" (UID: \"87a9b265-79da-4fbd-8c71-83ff25214c57\") " pod="openshift-monitoring/thanos-querier-56d858f7df-dh959" Apr 16 18:19:27.494361 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.494230 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/87a9b265-79da-4fbd-8c71-83ff25214c57-secret-grpc-tls\") pod \"thanos-querier-56d858f7df-dh959\" (UID: \"87a9b265-79da-4fbd-8c71-83ff25214c57\") " pod="openshift-monitoring/thanos-querier-56d858f7df-dh959" Apr 16 18:19:27.494361 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.494256 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2459b\" (UniqueName: \"kubernetes.io/projected/87a9b265-79da-4fbd-8c71-83ff25214c57-kube-api-access-2459b\") pod \"thanos-querier-56d858f7df-dh959\" (UID: \"87a9b265-79da-4fbd-8c71-83ff25214c57\") " pod="openshift-monitoring/thanos-querier-56d858f7df-dh959" Apr 16 18:19:27.494361 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.494323 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" 
(UniqueName: \"kubernetes.io/secret/87a9b265-79da-4fbd-8c71-83ff25214c57-secret-thanos-querier-tls\") pod \"thanos-querier-56d858f7df-dh959\" (UID: \"87a9b265-79da-4fbd-8c71-83ff25214c57\") " pod="openshift-monitoring/thanos-querier-56d858f7df-dh959" Apr 16 18:19:27.494361 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.494344 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/87a9b265-79da-4fbd-8c71-83ff25214c57-metrics-client-ca\") pod \"thanos-querier-56d858f7df-dh959\" (UID: \"87a9b265-79da-4fbd-8c71-83ff25214c57\") " pod="openshift-monitoring/thanos-querier-56d858f7df-dh959" Apr 16 18:19:27.494504 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.494364 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/87a9b265-79da-4fbd-8c71-83ff25214c57-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-56d858f7df-dh959\" (UID: \"87a9b265-79da-4fbd-8c71-83ff25214c57\") " pod="openshift-monitoring/thanos-querier-56d858f7df-dh959" Apr 16 18:19:27.494504 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.494378 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/87a9b265-79da-4fbd-8c71-83ff25214c57-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-56d858f7df-dh959\" (UID: \"87a9b265-79da-4fbd-8c71-83ff25214c57\") " pod="openshift-monitoring/thanos-querier-56d858f7df-dh959" Apr 16 18:19:27.595595 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.595523 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/87a9b265-79da-4fbd-8c71-83ff25214c57-secret-thanos-querier-tls\") pod 
\"thanos-querier-56d858f7df-dh959\" (UID: \"87a9b265-79da-4fbd-8c71-83ff25214c57\") " pod="openshift-monitoring/thanos-querier-56d858f7df-dh959" Apr 16 18:19:27.595595 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.595592 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/87a9b265-79da-4fbd-8c71-83ff25214c57-metrics-client-ca\") pod \"thanos-querier-56d858f7df-dh959\" (UID: \"87a9b265-79da-4fbd-8c71-83ff25214c57\") " pod="openshift-monitoring/thanos-querier-56d858f7df-dh959" Apr 16 18:19:27.595812 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.595624 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/87a9b265-79da-4fbd-8c71-83ff25214c57-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-56d858f7df-dh959\" (UID: \"87a9b265-79da-4fbd-8c71-83ff25214c57\") " pod="openshift-monitoring/thanos-querier-56d858f7df-dh959" Apr 16 18:19:27.595812 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.595646 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/87a9b265-79da-4fbd-8c71-83ff25214c57-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-56d858f7df-dh959\" (UID: \"87a9b265-79da-4fbd-8c71-83ff25214c57\") " pod="openshift-monitoring/thanos-querier-56d858f7df-dh959" Apr 16 18:19:27.595812 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.595689 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/87a9b265-79da-4fbd-8c71-83ff25214c57-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-56d858f7df-dh959\" (UID: \"87a9b265-79da-4fbd-8c71-83ff25214c57\") " pod="openshift-monitoring/thanos-querier-56d858f7df-dh959" 
Apr 16 18:19:27.595812 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.595720 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/87a9b265-79da-4fbd-8c71-83ff25214c57-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-56d858f7df-dh959\" (UID: \"87a9b265-79da-4fbd-8c71-83ff25214c57\") " pod="openshift-monitoring/thanos-querier-56d858f7df-dh959"
Apr 16 18:19:27.595812 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.595769 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/87a9b265-79da-4fbd-8c71-83ff25214c57-secret-grpc-tls\") pod \"thanos-querier-56d858f7df-dh959\" (UID: \"87a9b265-79da-4fbd-8c71-83ff25214c57\") " pod="openshift-monitoring/thanos-querier-56d858f7df-dh959"
Apr 16 18:19:27.595812 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.595795 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2459b\" (UniqueName: \"kubernetes.io/projected/87a9b265-79da-4fbd-8c71-83ff25214c57-kube-api-access-2459b\") pod \"thanos-querier-56d858f7df-dh959\" (UID: \"87a9b265-79da-4fbd-8c71-83ff25214c57\") " pod="openshift-monitoring/thanos-querier-56d858f7df-dh959"
Apr 16 18:19:27.596966 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.596944 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/87a9b265-79da-4fbd-8c71-83ff25214c57-metrics-client-ca\") pod \"thanos-querier-56d858f7df-dh959\" (UID: \"87a9b265-79da-4fbd-8c71-83ff25214c57\") " pod="openshift-monitoring/thanos-querier-56d858f7df-dh959"
Apr 16 18:19:27.598924 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.598876 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/87a9b265-79da-4fbd-8c71-83ff25214c57-secret-thanos-querier-tls\") pod \"thanos-querier-56d858f7df-dh959\" (UID: \"87a9b265-79da-4fbd-8c71-83ff25214c57\") " pod="openshift-monitoring/thanos-querier-56d858f7df-dh959"
Apr 16 18:19:27.599029 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.598923 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/87a9b265-79da-4fbd-8c71-83ff25214c57-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-56d858f7df-dh959\" (UID: \"87a9b265-79da-4fbd-8c71-83ff25214c57\") " pod="openshift-monitoring/thanos-querier-56d858f7df-dh959"
Apr 16 18:19:27.599097 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.599027 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/87a9b265-79da-4fbd-8c71-83ff25214c57-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-56d858f7df-dh959\" (UID: \"87a9b265-79da-4fbd-8c71-83ff25214c57\") " pod="openshift-monitoring/thanos-querier-56d858f7df-dh959"
Apr 16 18:19:27.599433 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.599411 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/87a9b265-79da-4fbd-8c71-83ff25214c57-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-56d858f7df-dh959\" (UID: \"87a9b265-79da-4fbd-8c71-83ff25214c57\") " pod="openshift-monitoring/thanos-querier-56d858f7df-dh959"
Apr 16 18:19:27.599529 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.599459 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/87a9b265-79da-4fbd-8c71-83ff25214c57-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-56d858f7df-dh959\" (UID: \"87a9b265-79da-4fbd-8c71-83ff25214c57\") " pod="openshift-monitoring/thanos-querier-56d858f7df-dh959"
Apr 16 18:19:27.600225 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.600204 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/87a9b265-79da-4fbd-8c71-83ff25214c57-secret-grpc-tls\") pod \"thanos-querier-56d858f7df-dh959\" (UID: \"87a9b265-79da-4fbd-8c71-83ff25214c57\") " pod="openshift-monitoring/thanos-querier-56d858f7df-dh959"
Apr 16 18:19:27.605008 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.604989 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2459b\" (UniqueName: \"kubernetes.io/projected/87a9b265-79da-4fbd-8c71-83ff25214c57-kube-api-access-2459b\") pod \"thanos-querier-56d858f7df-dh959\" (UID: \"87a9b265-79da-4fbd-8c71-83ff25214c57\") " pod="openshift-monitoring/thanos-querier-56d858f7df-dh959"
Apr 16 18:19:27.631142 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.631094 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-5mxgb" event={"ID":"bd4a3eb9-d8d8-4957-b1e4-87fc2e796d99","Type":"ContainerStarted","Data":"549f411c80823302718643488231d6e11d2bf52eea7b4d26fb6312da920fe502"}
Apr 16 18:19:27.632651 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.632627 2573 generic.go:358] "Generic (PLEG): container finished" podID="85fa889e-121e-49ab-b5e9-49f2f731ad8b" containerID="dbf5aef7b96f400a654a0491591aa5f467d9ccc42463c18e21c51193576edd47" exitCode=0
Apr 16 18:19:27.632922 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.632702 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8bbb7" event={"ID":"85fa889e-121e-49ab-b5e9-49f2f731ad8b","Type":"ContainerDied","Data":"dbf5aef7b96f400a654a0491591aa5f467d9ccc42463c18e21c51193576edd47"}
Apr 16 18:19:27.633915 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.633887 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fqp4t" event={"ID":"b8503a04-7aaa-49ef-bec9-fb099ecb0065","Type":"ContainerStarted","Data":"7d6f8adc066b71be2eec1a708cf17e6259caa959ab4b6d63ee5820662d7f03db"}
Apr 16 18:19:27.650845 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.650403 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-5669946b84-5mxgb" podStartSLOduration=1.420419051 podStartE2EDuration="2.650387202s" podCreationTimestamp="2026-04-16 18:19:25 +0000 UTC" firstStartedPulling="2026-04-16 18:19:25.949486917 +0000 UTC m=+167.456752012" lastFinishedPulling="2026-04-16 18:19:27.179455053 +0000 UTC m=+168.686720163" observedRunningTime="2026-04-16 18:19:27.649537861 +0000 UTC m=+169.156802977" watchObservedRunningTime="2026-04-16 18:19:27.650387202 +0000 UTC m=+169.157652318"
Apr 16 18:19:27.737670 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.737647 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-56d858f7df-dh959"
Apr 16 18:19:27.893780 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:27.893728 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-56d858f7df-dh959"]
Apr 16 18:19:27.896988 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:19:27.896948 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87a9b265_79da_4fbd_8c71_83ff25214c57.slice/crio-6a507f662a8c40b26c8fff4679a174d75591a9374c22d476186ca1aaaf92a90c WatchSource:0}: Error finding container 6a507f662a8c40b26c8fff4679a174d75591a9374c22d476186ca1aaaf92a90c: Status 404 returned error can't find the container with id 6a507f662a8c40b26c8fff4679a174d75591a9374c22d476186ca1aaaf92a90c
Apr 16 18:19:28.117120 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:28.117044 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sw2bl"
Apr 16 18:19:28.639625 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:28.639582 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-56d858f7df-dh959" event={"ID":"87a9b265-79da-4fbd-8c71-83ff25214c57","Type":"ContainerStarted","Data":"6a507f662a8c40b26c8fff4679a174d75591a9374c22d476186ca1aaaf92a90c"}
Apr 16 18:19:28.641770 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:28.641737 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8bbb7" event={"ID":"85fa889e-121e-49ab-b5e9-49f2f731ad8b","Type":"ContainerStarted","Data":"5dd71a90c1aa48ff2c66282d1f1515d504fd274c61e26ed6c89ed101261f0da9"}
Apr 16 18:19:28.641903 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:28.641790 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8bbb7" event={"ID":"85fa889e-121e-49ab-b5e9-49f2f731ad8b","Type":"ContainerStarted","Data":"de41f9f4482baf9ee46b57b4c5deb1db05a6e565e75352abafac4fde6cafe3ea"}
Apr 16 18:19:28.660718 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:28.660659 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-8bbb7" podStartSLOduration=2.932548246 podStartE2EDuration="3.660637469s" podCreationTimestamp="2026-04-16 18:19:25 +0000 UTC" firstStartedPulling="2026-04-16 18:19:25.756978749 +0000 UTC m=+167.264243844" lastFinishedPulling="2026-04-16 18:19:26.485067965 +0000 UTC m=+167.992333067" observedRunningTime="2026-04-16 18:19:28.659788383 +0000 UTC m=+170.167053499" watchObservedRunningTime="2026-04-16 18:19:28.660637469 +0000 UTC m=+170.167902587"
Apr 16 18:19:29.646082 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:29.646044 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fqp4t" event={"ID":"b8503a04-7aaa-49ef-bec9-fb099ecb0065","Type":"ContainerStarted","Data":"6555767b70d16860b2200616abfa10d3014609fd8f162f768c540141d8dc73a8"}
Apr 16 18:19:29.661025 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:29.660979 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-fqp4t" podStartSLOduration=138.1675285 podStartE2EDuration="2m19.660966268s" podCreationTimestamp="2026-04-16 18:17:10 +0000 UTC" firstStartedPulling="2026-04-16 18:19:27.25391751 +0000 UTC m=+168.761182604" lastFinishedPulling="2026-04-16 18:19:28.747355273 +0000 UTC m=+170.254620372" observedRunningTime="2026-04-16 18:19:29.660873271 +0000 UTC m=+171.168138390" watchObservedRunningTime="2026-04-16 18:19:29.660966268 +0000 UTC m=+171.168231384"
Apr 16 18:19:30.651099 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:30.651069 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-56d858f7df-dh959" event={"ID":"87a9b265-79da-4fbd-8c71-83ff25214c57","Type":"ContainerStarted","Data":"09b3f37338742a7c9ba7f6807db7a4184adb28bfa99be0d1c2988a0b7d186b20"}
Apr 16 18:19:30.651099 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:30.651106 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-56d858f7df-dh959" event={"ID":"87a9b265-79da-4fbd-8c71-83ff25214c57","Type":"ContainerStarted","Data":"2c654425e2f0e37a63eeed83394d3c9ccc1d493bf734ce30b9ec34331e0fe389"}
Apr 16 18:19:30.651506 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:30.651115 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-56d858f7df-dh959" event={"ID":"87a9b265-79da-4fbd-8c71-83ff25214c57","Type":"ContainerStarted","Data":"d300c7fb7dedb48289967d435b12d9ae437fc5fcdbdc0f1e5d46dd6bc15432fb"}
Apr 16 18:19:31.657401 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:31.657360 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-56d858f7df-dh959" event={"ID":"87a9b265-79da-4fbd-8c71-83ff25214c57","Type":"ContainerStarted","Data":"f724951bc9f26c84c7e042cf275f6a4e93ca1dc8737976cdaf0f907e9db673fb"}
Apr 16 18:19:31.657401 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:31.657403 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-56d858f7df-dh959" event={"ID":"87a9b265-79da-4fbd-8c71-83ff25214c57","Type":"ContainerStarted","Data":"838d1ff880774972bf2451cc3dd789275ce992b143645c09c9834034070dc265"}
Apr 16 18:19:31.657927 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:31.657416 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-56d858f7df-dh959" event={"ID":"87a9b265-79da-4fbd-8c71-83ff25214c57","Type":"ContainerStarted","Data":"42f71103c77796ecdbc2635e38f52fbcb53962d7e84ed941620bceabf23bb68c"}
Apr 16 18:19:31.657927 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:31.657539 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-56d858f7df-dh959"
Apr 16 18:19:31.681887 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:31.681838 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-56d858f7df-dh959" podStartSLOduration=1.261995594 podStartE2EDuration="4.68182424s" podCreationTimestamp="2026-04-16 18:19:27 +0000 UTC" firstStartedPulling="2026-04-16 18:19:27.899195373 +0000 UTC m=+169.406460468" lastFinishedPulling="2026-04-16 18:19:31.319024003 +0000 UTC m=+172.826289114" observedRunningTime="2026-04-16 18:19:31.680326557 +0000 UTC m=+173.187591673" watchObservedRunningTime="2026-04-16 18:19:31.68182424 +0000 UTC m=+173.189089355"
Apr 16 18:19:32.615682 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:32.615651 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-whfl9"
Apr 16 18:19:37.666110 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:37.666081 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-56d858f7df-dh959"
Apr 16 18:19:42.117935 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:42.117901 2573 scope.go:117] "RemoveContainer" containerID="660049fc0f4f5940b3195b39b9c7309a7a912fd03509a07929a7bb3b272315a8"
Apr 16 18:19:42.688786 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:42.688759 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-dn4qx_a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0/console-operator/2.log"
Apr 16 18:19:42.688973 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:42.688864 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-dn4qx" event={"ID":"a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0","Type":"ContainerStarted","Data":"97f269ad21f797d116f8f16a1c99e468028ab4d271055a367742fa0f5e7aa244"}
Apr 16 18:19:42.689140 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:42.689119 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-dn4qx"
Apr 16 18:19:42.707445 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:42.707407 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-d87b8d5fc-dn4qx" podStartSLOduration=57.286013631 podStartE2EDuration="59.707397148s" podCreationTimestamp="2026-04-16 18:18:43 +0000 UTC" firstStartedPulling="2026-04-16 18:18:44.09803854 +0000 UTC m=+125.605303635" lastFinishedPulling="2026-04-16 18:18:46.519422058 +0000 UTC m=+128.026687152" observedRunningTime="2026-04-16 18:19:42.707242052 +0000 UTC m=+184.214507167" watchObservedRunningTime="2026-04-16 18:19:42.707397148 +0000 UTC m=+184.214662264"
Apr 16 18:19:43.499188 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:43.499162 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-d87b8d5fc-dn4qx"
Apr 16 18:19:43.671481 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:43.671443 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-586b57c7b4-bmlww"]
Apr 16 18:19:43.674935 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:43.674918 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-bmlww"
Apr 16 18:19:43.677441 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:43.677417 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-nzxfm\""
Apr 16 18:19:43.677583 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:43.677422 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 16 18:19:43.677583 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:43.677422 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 16 18:19:43.683680 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:43.683661 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-bmlww"]
Apr 16 18:19:43.733107 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:43.733067 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzdmw\" (UniqueName: \"kubernetes.io/projected/635733b3-ada2-4619-ac36-00ecd1081e1e-kube-api-access-hzdmw\") pod \"downloads-586b57c7b4-bmlww\" (UID: \"635733b3-ada2-4619-ac36-00ecd1081e1e\") " pod="openshift-console/downloads-586b57c7b4-bmlww"
Apr 16 18:19:43.834207 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:43.834133 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hzdmw\" (UniqueName: \"kubernetes.io/projected/635733b3-ada2-4619-ac36-00ecd1081e1e-kube-api-access-hzdmw\") pod \"downloads-586b57c7b4-bmlww\" (UID: \"635733b3-ada2-4619-ac36-00ecd1081e1e\") " pod="openshift-console/downloads-586b57c7b4-bmlww"
Apr 16 18:19:43.842155 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:43.842125 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzdmw\" (UniqueName: \"kubernetes.io/projected/635733b3-ada2-4619-ac36-00ecd1081e1e-kube-api-access-hzdmw\") pod \"downloads-586b57c7b4-bmlww\" (UID: \"635733b3-ada2-4619-ac36-00ecd1081e1e\") " pod="openshift-console/downloads-586b57c7b4-bmlww"
Apr 16 18:19:43.983878 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:43.983847 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-bmlww"
Apr 16 18:19:44.097055 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:44.097009 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-bmlww"]
Apr 16 18:19:44.099579 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:19:44.099517 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod635733b3_ada2_4619_ac36_00ecd1081e1e.slice/crio-3047bc4d10efc67569b61e8090573fb04a4717ea200883c576d9ed6ce8a2e2b3 WatchSource:0}: Error finding container 3047bc4d10efc67569b61e8090573fb04a4717ea200883c576d9ed6ce8a2e2b3: Status 404 returned error can't find the container with id 3047bc4d10efc67569b61e8090573fb04a4717ea200883c576d9ed6ce8a2e2b3
Apr 16 18:19:44.696078 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:44.696037 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-bmlww" event={"ID":"635733b3-ada2-4619-ac36-00ecd1081e1e","Type":"ContainerStarted","Data":"3047bc4d10efc67569b61e8090573fb04a4717ea200883c576d9ed6ce8a2e2b3"}
Apr 16 18:19:49.799423 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:49.799389 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-766767b746-px5db"]
Apr 16 18:19:49.803166 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:49.803143 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-766767b746-px5db"
Apr 16 18:19:49.805624 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:49.805598 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 16 18:19:49.806625 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:49.806604 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 16 18:19:49.806742 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:49.806644 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 16 18:19:49.806742 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:49.806607 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 16 18:19:49.806742 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:49.806679 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-cn4xh\""
Apr 16 18:19:49.806901 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:49.806807 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 16 18:19:49.816495 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:49.816469 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-766767b746-px5db"]
Apr 16 18:19:49.889482 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:49.889451 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c072bf5a-1704-4832-a159-5d9feaf10bf4-oauth-serving-cert\") pod \"console-766767b746-px5db\" (UID: \"c072bf5a-1704-4832-a159-5d9feaf10bf4\") " pod="openshift-console/console-766767b746-px5db"
Apr 16 18:19:49.889675 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:49.889569 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c072bf5a-1704-4832-a159-5d9feaf10bf4-service-ca\") pod \"console-766767b746-px5db\" (UID: \"c072bf5a-1704-4832-a159-5d9feaf10bf4\") " pod="openshift-console/console-766767b746-px5db"
Apr 16 18:19:49.889675 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:49.889613 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c072bf5a-1704-4832-a159-5d9feaf10bf4-console-oauth-config\") pod \"console-766767b746-px5db\" (UID: \"c072bf5a-1704-4832-a159-5d9feaf10bf4\") " pod="openshift-console/console-766767b746-px5db"
Apr 16 18:19:49.889675 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:49.889641 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c072bf5a-1704-4832-a159-5d9feaf10bf4-console-config\") pod \"console-766767b746-px5db\" (UID: \"c072bf5a-1704-4832-a159-5d9feaf10bf4\") " pod="openshift-console/console-766767b746-px5db"
Apr 16 18:19:49.889675 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:49.889670 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c072bf5a-1704-4832-a159-5d9feaf10bf4-console-serving-cert\") pod \"console-766767b746-px5db\" (UID: \"c072bf5a-1704-4832-a159-5d9feaf10bf4\") " pod="openshift-console/console-766767b746-px5db"
Apr 16 18:19:49.889868 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:49.889694 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtknt\" (UniqueName: \"kubernetes.io/projected/c072bf5a-1704-4832-a159-5d9feaf10bf4-kube-api-access-xtknt\") pod \"console-766767b746-px5db\" (UID: \"c072bf5a-1704-4832-a159-5d9feaf10bf4\") " pod="openshift-console/console-766767b746-px5db"
Apr 16 18:19:49.990985 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:49.990952 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c072bf5a-1704-4832-a159-5d9feaf10bf4-oauth-serving-cert\") pod \"console-766767b746-px5db\" (UID: \"c072bf5a-1704-4832-a159-5d9feaf10bf4\") " pod="openshift-console/console-766767b746-px5db"
Apr 16 18:19:49.991167 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:49.991020 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c072bf5a-1704-4832-a159-5d9feaf10bf4-service-ca\") pod \"console-766767b746-px5db\" (UID: \"c072bf5a-1704-4832-a159-5d9feaf10bf4\") " pod="openshift-console/console-766767b746-px5db"
Apr 16 18:19:49.991167 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:49.991060 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c072bf5a-1704-4832-a159-5d9feaf10bf4-console-oauth-config\") pod \"console-766767b746-px5db\" (UID: \"c072bf5a-1704-4832-a159-5d9feaf10bf4\") " pod="openshift-console/console-766767b746-px5db"
Apr 16 18:19:49.991285 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:49.991169 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c072bf5a-1704-4832-a159-5d9feaf10bf4-console-config\") pod \"console-766767b746-px5db\" (UID: \"c072bf5a-1704-4832-a159-5d9feaf10bf4\") " pod="openshift-console/console-766767b746-px5db"
Apr 16 18:19:49.991340 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:49.991313 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c072bf5a-1704-4832-a159-5d9feaf10bf4-console-serving-cert\") pod \"console-766767b746-px5db\" (UID: \"c072bf5a-1704-4832-a159-5d9feaf10bf4\") " pod="openshift-console/console-766767b746-px5db"
Apr 16 18:19:49.991393 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:49.991345 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xtknt\" (UniqueName: \"kubernetes.io/projected/c072bf5a-1704-4832-a159-5d9feaf10bf4-kube-api-access-xtknt\") pod \"console-766767b746-px5db\" (UID: \"c072bf5a-1704-4832-a159-5d9feaf10bf4\") " pod="openshift-console/console-766767b746-px5db"
Apr 16 18:19:49.991835 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:49.991810 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c072bf5a-1704-4832-a159-5d9feaf10bf4-oauth-serving-cert\") pod \"console-766767b746-px5db\" (UID: \"c072bf5a-1704-4832-a159-5d9feaf10bf4\") " pod="openshift-console/console-766767b746-px5db"
Apr 16 18:19:49.991971 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:49.991950 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c072bf5a-1704-4832-a159-5d9feaf10bf4-console-config\") pod \"console-766767b746-px5db\" (UID: \"c072bf5a-1704-4832-a159-5d9feaf10bf4\") " pod="openshift-console/console-766767b746-px5db"
Apr 16 18:19:49.992029 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:49.992011 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c072bf5a-1704-4832-a159-5d9feaf10bf4-service-ca\") pod \"console-766767b746-px5db\" (UID: \"c072bf5a-1704-4832-a159-5d9feaf10bf4\") " pod="openshift-console/console-766767b746-px5db"
Apr 16 18:19:49.994152 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:49.994132 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c072bf5a-1704-4832-a159-5d9feaf10bf4-console-oauth-config\") pod \"console-766767b746-px5db\" (UID: \"c072bf5a-1704-4832-a159-5d9feaf10bf4\") " pod="openshift-console/console-766767b746-px5db"
Apr 16 18:19:49.994244 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:49.994156 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c072bf5a-1704-4832-a159-5d9feaf10bf4-console-serving-cert\") pod \"console-766767b746-px5db\" (UID: \"c072bf5a-1704-4832-a159-5d9feaf10bf4\") " pod="openshift-console/console-766767b746-px5db"
Apr 16 18:19:49.999964 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:49.999944 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtknt\" (UniqueName: \"kubernetes.io/projected/c072bf5a-1704-4832-a159-5d9feaf10bf4-kube-api-access-xtknt\") pod \"console-766767b746-px5db\" (UID: \"c072bf5a-1704-4832-a159-5d9feaf10bf4\") " pod="openshift-console/console-766767b746-px5db"
Apr 16 18:19:50.115176 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:50.115096 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-766767b746-px5db"
Apr 16 18:19:50.247696 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:50.247663 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-766767b746-px5db"]
Apr 16 18:19:50.251078 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:19:50.250947 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc072bf5a_1704_4832_a159_5d9feaf10bf4.slice/crio-e07e697a14835a50a40516de2d2a63f4d4224ef0b9f6b37ee2965feebf3a8436 WatchSource:0}: Error finding container e07e697a14835a50a40516de2d2a63f4d4224ef0b9f6b37ee2965feebf3a8436: Status 404 returned error can't find the container with id e07e697a14835a50a40516de2d2a63f4d4224ef0b9f6b37ee2965feebf3a8436
Apr 16 18:19:50.716139 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:50.716104 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-766767b746-px5db" event={"ID":"c072bf5a-1704-4832-a159-5d9feaf10bf4","Type":"ContainerStarted","Data":"e07e697a14835a50a40516de2d2a63f4d4224ef0b9f6b37ee2965feebf3a8436"}
Apr 16 18:19:53.728557 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:53.728499 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-766767b746-px5db" event={"ID":"c072bf5a-1704-4832-a159-5d9feaf10bf4","Type":"ContainerStarted","Data":"6ec51f7036d29848c8cef8ae245056f61b2a9269bb6d59938608ddcea13a9a5d"}
Apr 16 18:19:53.747655 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:19:53.747603 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-766767b746-px5db" podStartSLOduration=1.799572414 podStartE2EDuration="4.74758662s" podCreationTimestamp="2026-04-16 18:19:49 +0000 UTC" firstStartedPulling="2026-04-16 18:19:50.253463024 +0000 UTC m=+191.760728123" lastFinishedPulling="2026-04-16 18:19:53.20147722 +0000 UTC m=+194.708742329" observedRunningTime="2026-04-16 18:19:53.746094889 +0000 UTC m=+195.253360016" watchObservedRunningTime="2026-04-16 18:19:53.74758662 +0000 UTC m=+195.254851737"
Apr 16 18:20:00.115451 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:00.115414 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-766767b746-px5db"
Apr 16 18:20:00.115451 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:00.115462 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-766767b746-px5db"
Apr 16 18:20:00.120449 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:00.120426 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-766767b746-px5db"
Apr 16 18:20:00.384253 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:00.384225 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-84949b9f8-4wj2f"]
Apr 16 18:20:00.395506 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:00.395482 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84949b9f8-4wj2f"]
Apr 16 18:20:00.395687 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:00.395632 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-84949b9f8-4wj2f"
Apr 16 18:20:00.403330 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:00.403146 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 16 18:20:00.490346 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:00.490313 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f2c8266e-9820-424c-a836-71c0ecf29c39-console-serving-cert\") pod \"console-84949b9f8-4wj2f\" (UID: \"f2c8266e-9820-424c-a836-71c0ecf29c39\") " pod="openshift-console/console-84949b9f8-4wj2f"
Apr 16 18:20:00.490346 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:00.490351 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f2c8266e-9820-424c-a836-71c0ecf29c39-console-config\") pod \"console-84949b9f8-4wj2f\" (UID: \"f2c8266e-9820-424c-a836-71c0ecf29c39\") " pod="openshift-console/console-84949b9f8-4wj2f"
Apr 16 18:20:00.490581 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:00.490371 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f2c8266e-9820-424c-a836-71c0ecf29c39-console-oauth-config\") pod \"console-84949b9f8-4wj2f\" (UID: \"f2c8266e-9820-424c-a836-71c0ecf29c39\") " pod="openshift-console/console-84949b9f8-4wj2f"
Apr 16 18:20:00.490581 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:00.490496 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f2c8266e-9820-424c-a836-71c0ecf29c39-oauth-serving-cert\") pod \"console-84949b9f8-4wj2f\" (UID: \"f2c8266e-9820-424c-a836-71c0ecf29c39\") " pod="openshift-console/console-84949b9f8-4wj2f"
Apr 16 18:20:00.490661 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:00.490534 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f2c8266e-9820-424c-a836-71c0ecf29c39-service-ca\") pod \"console-84949b9f8-4wj2f\" (UID: \"f2c8266e-9820-424c-a836-71c0ecf29c39\") " pod="openshift-console/console-84949b9f8-4wj2f"
Apr 16 18:20:00.490661 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:00.490608 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpqvh\" (UniqueName: \"kubernetes.io/projected/f2c8266e-9820-424c-a836-71c0ecf29c39-kube-api-access-kpqvh\") pod \"console-84949b9f8-4wj2f\" (UID: \"f2c8266e-9820-424c-a836-71c0ecf29c39\") " pod="openshift-console/console-84949b9f8-4wj2f"
Apr 16 18:20:00.490749 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:00.490671 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2c8266e-9820-424c-a836-71c0ecf29c39-trusted-ca-bundle\") pod \"console-84949b9f8-4wj2f\" (UID: \"f2c8266e-9820-424c-a836-71c0ecf29c39\") " pod="openshift-console/console-84949b9f8-4wj2f"
Apr 16 18:20:00.591854 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:00.591817 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2c8266e-9820-424c-a836-71c0ecf29c39-trusted-ca-bundle\") pod \"console-84949b9f8-4wj2f\" (UID: \"f2c8266e-9820-424c-a836-71c0ecf29c39\") " pod="openshift-console/console-84949b9f8-4wj2f"
Apr 16 18:20:00.591854 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:00.591858 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f2c8266e-9820-424c-a836-71c0ecf29c39-console-serving-cert\") pod \"console-84949b9f8-4wj2f\" (UID: \"f2c8266e-9820-424c-a836-71c0ecf29c39\") " pod="openshift-console/console-84949b9f8-4wj2f"
Apr 16 18:20:00.592117 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:00.591874 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f2c8266e-9820-424c-a836-71c0ecf29c39-console-config\") pod \"console-84949b9f8-4wj2f\" (UID: \"f2c8266e-9820-424c-a836-71c0ecf29c39\") " pod="openshift-console/console-84949b9f8-4wj2f"
Apr 16 18:20:00.592117 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:00.591998 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f2c8266e-9820-424c-a836-71c0ecf29c39-console-oauth-config\") pod \"console-84949b9f8-4wj2f\" (UID: \"f2c8266e-9820-424c-a836-71c0ecf29c39\") " pod="openshift-console/console-84949b9f8-4wj2f"
Apr 16 18:20:00.592117 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:00.592092 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f2c8266e-9820-424c-a836-71c0ecf29c39-oauth-serving-cert\") pod \"console-84949b9f8-4wj2f\" (UID: \"f2c8266e-9820-424c-a836-71c0ecf29c39\") " pod="openshift-console/console-84949b9f8-4wj2f"
Apr 16 18:20:00.592292 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:00.592136 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f2c8266e-9820-424c-a836-71c0ecf29c39-service-ca\") pod \"console-84949b9f8-4wj2f\" (UID: \"f2c8266e-9820-424c-a836-71c0ecf29c39\") " pod="openshift-console/console-84949b9f8-4wj2f"
Apr 16 18:20:00.592346 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:00.592308 2573 reconciler_common.go:224] "operationExecutor.MountVolume started
for volume \"kube-api-access-kpqvh\" (UniqueName: \"kubernetes.io/projected/f2c8266e-9820-424c-a836-71c0ecf29c39-kube-api-access-kpqvh\") pod \"console-84949b9f8-4wj2f\" (UID: \"f2c8266e-9820-424c-a836-71c0ecf29c39\") " pod="openshift-console/console-84949b9f8-4wj2f" Apr 16 18:20:00.592821 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:00.592767 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f2c8266e-9820-424c-a836-71c0ecf29c39-oauth-serving-cert\") pod \"console-84949b9f8-4wj2f\" (UID: \"f2c8266e-9820-424c-a836-71c0ecf29c39\") " pod="openshift-console/console-84949b9f8-4wj2f" Apr 16 18:20:00.592821 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:00.592789 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f2c8266e-9820-424c-a836-71c0ecf29c39-service-ca\") pod \"console-84949b9f8-4wj2f\" (UID: \"f2c8266e-9820-424c-a836-71c0ecf29c39\") " pod="openshift-console/console-84949b9f8-4wj2f" Apr 16 18:20:00.592821 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:00.592806 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2c8266e-9820-424c-a836-71c0ecf29c39-trusted-ca-bundle\") pod \"console-84949b9f8-4wj2f\" (UID: \"f2c8266e-9820-424c-a836-71c0ecf29c39\") " pod="openshift-console/console-84949b9f8-4wj2f" Apr 16 18:20:00.593090 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:00.593049 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f2c8266e-9820-424c-a836-71c0ecf29c39-console-config\") pod \"console-84949b9f8-4wj2f\" (UID: \"f2c8266e-9820-424c-a836-71c0ecf29c39\") " pod="openshift-console/console-84949b9f8-4wj2f" Apr 16 18:20:00.594637 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:00.594621 2573 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f2c8266e-9820-424c-a836-71c0ecf29c39-console-serving-cert\") pod \"console-84949b9f8-4wj2f\" (UID: \"f2c8266e-9820-424c-a836-71c0ecf29c39\") " pod="openshift-console/console-84949b9f8-4wj2f" Apr 16 18:20:00.594719 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:00.594699 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f2c8266e-9820-424c-a836-71c0ecf29c39-console-oauth-config\") pod \"console-84949b9f8-4wj2f\" (UID: \"f2c8266e-9820-424c-a836-71c0ecf29c39\") " pod="openshift-console/console-84949b9f8-4wj2f" Apr 16 18:20:00.604775 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:00.604751 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpqvh\" (UniqueName: \"kubernetes.io/projected/f2c8266e-9820-424c-a836-71c0ecf29c39-kube-api-access-kpqvh\") pod \"console-84949b9f8-4wj2f\" (UID: \"f2c8266e-9820-424c-a836-71c0ecf29c39\") " pod="openshift-console/console-84949b9f8-4wj2f" Apr 16 18:20:00.706531 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:00.706482 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-84949b9f8-4wj2f" Apr 16 18:20:00.754846 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:00.754589 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-bmlww" event={"ID":"635733b3-ada2-4619-ac36-00ecd1081e1e","Type":"ContainerStarted","Data":"34ae73139e1548f8282e64cc793971bbbb51ddd94bef6177b9437108e886410e"} Apr 16 18:20:00.756034 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:00.755140 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-586b57c7b4-bmlww" Apr 16 18:20:00.756617 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:00.756588 2573 patch_prober.go:28] interesting pod/downloads-586b57c7b4-bmlww container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.134.0.19:8080/\": dial tcp 10.134.0.19:8080: connect: connection refused" start-of-body= Apr 16 18:20:00.756723 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:00.756658 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-586b57c7b4-bmlww" podUID="635733b3-ada2-4619-ac36-00ecd1081e1e" containerName="download-server" probeResult="failure" output="Get \"http://10.134.0.19:8080/\": dial tcp 10.134.0.19:8080: connect: connection refused" Apr 16 18:20:00.759826 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:00.759804 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-766767b746-px5db" Apr 16 18:20:00.775501 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:00.775435 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-586b57c7b4-bmlww" podStartSLOduration=1.2803253479999999 podStartE2EDuration="17.775418521s" podCreationTimestamp="2026-04-16 18:19:43 +0000 UTC" firstStartedPulling="2026-04-16 18:19:44.101292333 +0000 UTC m=+185.608557427" 
lastFinishedPulling="2026-04-16 18:20:00.596385505 +0000 UTC m=+202.103650600" observedRunningTime="2026-04-16 18:20:00.773249728 +0000 UTC m=+202.280514838" watchObservedRunningTime="2026-04-16 18:20:00.775418521 +0000 UTC m=+202.282683637" Apr 16 18:20:00.843742 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:00.843679 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84949b9f8-4wj2f"] Apr 16 18:20:01.759857 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:01.759816 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84949b9f8-4wj2f" event={"ID":"f2c8266e-9820-424c-a836-71c0ecf29c39","Type":"ContainerStarted","Data":"5c1738435980179f1652bab7e442a8afd99c5d4fe5a2641696bf19adaff1c9c4"} Apr 16 18:20:01.759857 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:01.759861 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84949b9f8-4wj2f" event={"ID":"f2c8266e-9820-424c-a836-71c0ecf29c39","Type":"ContainerStarted","Data":"d6b356e6f35849838544fdc661083f5e3e91d1ce3344a11ec871ba49795beb51"} Apr 16 18:20:01.772037 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:01.772007 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-586b57c7b4-bmlww" Apr 16 18:20:01.783306 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:01.783248 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-84949b9f8-4wj2f" podStartSLOduration=1.783229581 podStartE2EDuration="1.783229581s" podCreationTimestamp="2026-04-16 18:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:20:01.782632637 +0000 UTC m=+203.289897765" watchObservedRunningTime="2026-04-16 18:20:01.783229581 +0000 UTC m=+203.290494698" Apr 16 18:20:10.706899 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:10.706862 2573 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-84949b9f8-4wj2f" Apr 16 18:20:10.707456 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:10.706910 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-84949b9f8-4wj2f" Apr 16 18:20:10.711398 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:10.711379 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-84949b9f8-4wj2f" Apr 16 18:20:10.791285 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:10.791254 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-84949b9f8-4wj2f" Apr 16 18:20:10.843160 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:10.843129 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-766767b746-px5db"] Apr 16 18:20:17.811406 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:17.811369 2573 generic.go:358] "Generic (PLEG): container finished" podID="386777a9-63c1-4fa1-b894-4d73395765d3" containerID="21a96ecac47f482cb1ce6a75a1cfbfa19b243081ca30ebcda605ebf237df3daa" exitCode=0 Apr 16 18:20:17.811868 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:17.811416 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-5hlbs" event={"ID":"386777a9-63c1-4fa1-b894-4d73395765d3","Type":"ContainerDied","Data":"21a96ecac47f482cb1ce6a75a1cfbfa19b243081ca30ebcda605ebf237df3daa"} Apr 16 18:20:17.811868 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:17.811770 2573 scope.go:117] "RemoveContainer" containerID="21a96ecac47f482cb1ce6a75a1cfbfa19b243081ca30ebcda605ebf237df3daa" Apr 16 18:20:18.816214 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:18.816182 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-5hlbs" 
event={"ID":"386777a9-63c1-4fa1-b894-4d73395765d3","Type":"ContainerStarted","Data":"83d97c9b4f7d2c21e36d0fc8847bc5d093fe4cd32ca0e6cc75c067ead6bb8a74"} Apr 16 18:20:35.862006 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:35.861947 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-766767b746-px5db" podUID="c072bf5a-1704-4832-a159-5d9feaf10bf4" containerName="console" containerID="cri-o://6ec51f7036d29848c8cef8ae245056f61b2a9269bb6d59938608ddcea13a9a5d" gracePeriod=15 Apr 16 18:20:36.134355 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:36.134335 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-766767b746-px5db_c072bf5a-1704-4832-a159-5d9feaf10bf4/console/0.log" Apr 16 18:20:36.134466 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:36.134392 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-766767b746-px5db" Apr 16 18:20:36.208378 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:36.208344 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c072bf5a-1704-4832-a159-5d9feaf10bf4-oauth-serving-cert\") pod \"c072bf5a-1704-4832-a159-5d9feaf10bf4\" (UID: \"c072bf5a-1704-4832-a159-5d9feaf10bf4\") " Apr 16 18:20:36.208569 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:36.208394 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c072bf5a-1704-4832-a159-5d9feaf10bf4-console-config\") pod \"c072bf5a-1704-4832-a159-5d9feaf10bf4\" (UID: \"c072bf5a-1704-4832-a159-5d9feaf10bf4\") " Apr 16 18:20:36.208569 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:36.208435 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c072bf5a-1704-4832-a159-5d9feaf10bf4-console-serving-cert\") pod \"c072bf5a-1704-4832-a159-5d9feaf10bf4\" (UID: \"c072bf5a-1704-4832-a159-5d9feaf10bf4\") " Apr 16 18:20:36.208569 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:36.208469 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c072bf5a-1704-4832-a159-5d9feaf10bf4-console-oauth-config\") pod \"c072bf5a-1704-4832-a159-5d9feaf10bf4\" (UID: \"c072bf5a-1704-4832-a159-5d9feaf10bf4\") " Apr 16 18:20:36.208569 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:36.208533 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c072bf5a-1704-4832-a159-5d9feaf10bf4-service-ca\") pod \"c072bf5a-1704-4832-a159-5d9feaf10bf4\" (UID: \"c072bf5a-1704-4832-a159-5d9feaf10bf4\") " Apr 16 18:20:36.208773 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:36.208582 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtknt\" (UniqueName: \"kubernetes.io/projected/c072bf5a-1704-4832-a159-5d9feaf10bf4-kube-api-access-xtknt\") pod \"c072bf5a-1704-4832-a159-5d9feaf10bf4\" (UID: \"c072bf5a-1704-4832-a159-5d9feaf10bf4\") " Apr 16 18:20:36.208935 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:36.208838 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c072bf5a-1704-4832-a159-5d9feaf10bf4-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c072bf5a-1704-4832-a159-5d9feaf10bf4" (UID: "c072bf5a-1704-4832-a159-5d9feaf10bf4"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:20:36.208935 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:36.208843 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c072bf5a-1704-4832-a159-5d9feaf10bf4-console-config" (OuterVolumeSpecName: "console-config") pod "c072bf5a-1704-4832-a159-5d9feaf10bf4" (UID: "c072bf5a-1704-4832-a159-5d9feaf10bf4"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:20:36.208935 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:36.208917 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c072bf5a-1704-4832-a159-5d9feaf10bf4-service-ca" (OuterVolumeSpecName: "service-ca") pod "c072bf5a-1704-4832-a159-5d9feaf10bf4" (UID: "c072bf5a-1704-4832-a159-5d9feaf10bf4"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:20:36.210671 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:36.210646 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c072bf5a-1704-4832-a159-5d9feaf10bf4-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c072bf5a-1704-4832-a159-5d9feaf10bf4" (UID: "c072bf5a-1704-4832-a159-5d9feaf10bf4"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:20:36.210750 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:36.210720 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c072bf5a-1704-4832-a159-5d9feaf10bf4-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c072bf5a-1704-4832-a159-5d9feaf10bf4" (UID: "c072bf5a-1704-4832-a159-5d9feaf10bf4"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:20:36.210833 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:36.210812 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c072bf5a-1704-4832-a159-5d9feaf10bf4-kube-api-access-xtknt" (OuterVolumeSpecName: "kube-api-access-xtknt") pod "c072bf5a-1704-4832-a159-5d9feaf10bf4" (UID: "c072bf5a-1704-4832-a159-5d9feaf10bf4"). InnerVolumeSpecName "kube-api-access-xtknt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:20:36.309453 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:36.309421 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c072bf5a-1704-4832-a159-5d9feaf10bf4-oauth-serving-cert\") on node \"ip-10-0-131-203.ec2.internal\" DevicePath \"\"" Apr 16 18:20:36.309453 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:36.309451 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c072bf5a-1704-4832-a159-5d9feaf10bf4-console-config\") on node \"ip-10-0-131-203.ec2.internal\" DevicePath \"\"" Apr 16 18:20:36.309673 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:36.309463 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c072bf5a-1704-4832-a159-5d9feaf10bf4-console-serving-cert\") on node \"ip-10-0-131-203.ec2.internal\" DevicePath \"\"" Apr 16 18:20:36.309673 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:36.309472 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c072bf5a-1704-4832-a159-5d9feaf10bf4-console-oauth-config\") on node \"ip-10-0-131-203.ec2.internal\" DevicePath \"\"" Apr 16 18:20:36.309673 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:36.309481 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/c072bf5a-1704-4832-a159-5d9feaf10bf4-service-ca\") on node \"ip-10-0-131-203.ec2.internal\" DevicePath \"\"" Apr 16 18:20:36.309673 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:36.309490 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xtknt\" (UniqueName: \"kubernetes.io/projected/c072bf5a-1704-4832-a159-5d9feaf10bf4-kube-api-access-xtknt\") on node \"ip-10-0-131-203.ec2.internal\" DevicePath \"\"" Apr 16 18:20:36.871519 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:36.871491 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-766767b746-px5db_c072bf5a-1704-4832-a159-5d9feaf10bf4/console/0.log" Apr 16 18:20:36.871964 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:36.871533 2573 generic.go:358] "Generic (PLEG): container finished" podID="c072bf5a-1704-4832-a159-5d9feaf10bf4" containerID="6ec51f7036d29848c8cef8ae245056f61b2a9269bb6d59938608ddcea13a9a5d" exitCode=2 Apr 16 18:20:36.871964 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:36.871602 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-766767b746-px5db" event={"ID":"c072bf5a-1704-4832-a159-5d9feaf10bf4","Type":"ContainerDied","Data":"6ec51f7036d29848c8cef8ae245056f61b2a9269bb6d59938608ddcea13a9a5d"} Apr 16 18:20:36.871964 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:36.871625 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-766767b746-px5db" Apr 16 18:20:36.871964 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:36.871632 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-766767b746-px5db" event={"ID":"c072bf5a-1704-4832-a159-5d9feaf10bf4","Type":"ContainerDied","Data":"e07e697a14835a50a40516de2d2a63f4d4224ef0b9f6b37ee2965feebf3a8436"} Apr 16 18:20:36.871964 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:36.871652 2573 scope.go:117] "RemoveContainer" containerID="6ec51f7036d29848c8cef8ae245056f61b2a9269bb6d59938608ddcea13a9a5d" Apr 16 18:20:36.879457 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:36.879440 2573 scope.go:117] "RemoveContainer" containerID="6ec51f7036d29848c8cef8ae245056f61b2a9269bb6d59938608ddcea13a9a5d" Apr 16 18:20:36.879730 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:20:36.879709 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ec51f7036d29848c8cef8ae245056f61b2a9269bb6d59938608ddcea13a9a5d\": container with ID starting with 6ec51f7036d29848c8cef8ae245056f61b2a9269bb6d59938608ddcea13a9a5d not found: ID does not exist" containerID="6ec51f7036d29848c8cef8ae245056f61b2a9269bb6d59938608ddcea13a9a5d" Apr 16 18:20:36.879799 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:36.879739 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ec51f7036d29848c8cef8ae245056f61b2a9269bb6d59938608ddcea13a9a5d"} err="failed to get container status \"6ec51f7036d29848c8cef8ae245056f61b2a9269bb6d59938608ddcea13a9a5d\": rpc error: code = NotFound desc = could not find container \"6ec51f7036d29848c8cef8ae245056f61b2a9269bb6d59938608ddcea13a9a5d\": container with ID starting with 6ec51f7036d29848c8cef8ae245056f61b2a9269bb6d59938608ddcea13a9a5d not found: ID does not exist" Apr 16 18:20:36.894768 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:36.894746 2573 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-766767b746-px5db"] Apr 16 18:20:36.899902 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:36.899880 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-766767b746-px5db"] Apr 16 18:20:37.120436 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:37.120402 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c072bf5a-1704-4832-a159-5d9feaf10bf4" path="/var/lib/kubelet/pods/c072bf5a-1704-4832-a159-5d9feaf10bf4/volumes" Apr 16 18:20:49.826617 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:49.826580 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7930c14-4ef0-4949-a2ae-9a240da66c3c-metrics-certs\") pod \"network-metrics-daemon-sw2bl\" (UID: \"a7930c14-4ef0-4949-a2ae-9a240da66c3c\") " pod="openshift-multus/network-metrics-daemon-sw2bl" Apr 16 18:20:49.828845 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:49.828826 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7930c14-4ef0-4949-a2ae-9a240da66c3c-metrics-certs\") pod \"network-metrics-daemon-sw2bl\" (UID: \"a7930c14-4ef0-4949-a2ae-9a240da66c3c\") " pod="openshift-multus/network-metrics-daemon-sw2bl" Apr 16 18:20:50.019770 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:50.019737 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-pwdr4\"" Apr 16 18:20:50.028381 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:50.028351 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sw2bl" Apr 16 18:20:50.172219 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:50.172187 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-sw2bl"] Apr 16 18:20:50.175168 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:20:50.175139 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7930c14_4ef0_4949_a2ae_9a240da66c3c.slice/crio-25956abdd3e6c35551df2dee2b892094e0eeab18f89d4edc58d1f876a67abbb0 WatchSource:0}: Error finding container 25956abdd3e6c35551df2dee2b892094e0eeab18f89d4edc58d1f876a67abbb0: Status 404 returned error can't find the container with id 25956abdd3e6c35551df2dee2b892094e0eeab18f89d4edc58d1f876a67abbb0 Apr 16 18:20:50.913953 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:50.913905 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sw2bl" event={"ID":"a7930c14-4ef0-4949-a2ae-9a240da66c3c","Type":"ContainerStarted","Data":"25956abdd3e6c35551df2dee2b892094e0eeab18f89d4edc58d1f876a67abbb0"} Apr 16 18:20:51.917937 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:51.917899 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sw2bl" event={"ID":"a7930c14-4ef0-4949-a2ae-9a240da66c3c","Type":"ContainerStarted","Data":"73141caba9042ddc2ca1a55c732974e5a8f7a0d3d62fc154f9f74b4220615f51"} Apr 16 18:20:51.917937 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:51.917938 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sw2bl" event={"ID":"a7930c14-4ef0-4949-a2ae-9a240da66c3c","Type":"ContainerStarted","Data":"fc19f3691cb0c13cc004ebf1690dda50f80e20acf62b6f4ebe6bb114248a9547"} Apr 16 18:20:51.934251 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:51.934199 2573 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-sw2bl" podStartSLOduration=252.090115189 podStartE2EDuration="4m12.934186661s" podCreationTimestamp="2026-04-16 18:16:39 +0000 UTC" firstStartedPulling="2026-04-16 18:20:50.176921791 +0000 UTC m=+251.684186889" lastFinishedPulling="2026-04-16 18:20:51.020993263 +0000 UTC m=+252.528258361" observedRunningTime="2026-04-16 18:20:51.93285592 +0000 UTC m=+253.440121049" watchObservedRunningTime="2026-04-16 18:20:51.934186661 +0000 UTC m=+253.441451776" Apr 16 18:20:53.279302 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:53.279254 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6b965755b-2rjdw"] Apr 16 18:20:53.279871 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:53.279834 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c072bf5a-1704-4832-a159-5d9feaf10bf4" containerName="console" Apr 16 18:20:53.279997 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:53.279922 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c072bf5a-1704-4832-a159-5d9feaf10bf4" containerName="console" Apr 16 18:20:53.280066 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:53.280041 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c072bf5a-1704-4832-a159-5d9feaf10bf4" containerName="console" Apr 16 18:20:53.283219 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:53.283197 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6b965755b-2rjdw" Apr 16 18:20:53.295162 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:53.295138 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b965755b-2rjdw"] Apr 16 18:20:53.358433 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:53.358401 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2859a692-31fc-43d0-a10d-d94fd969729c-console-serving-cert\") pod \"console-6b965755b-2rjdw\" (UID: \"2859a692-31fc-43d0-a10d-d94fd969729c\") " pod="openshift-console/console-6b965755b-2rjdw" Apr 16 18:20:53.358614 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:53.358435 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2859a692-31fc-43d0-a10d-d94fd969729c-console-oauth-config\") pod \"console-6b965755b-2rjdw\" (UID: \"2859a692-31fc-43d0-a10d-d94fd969729c\") " pod="openshift-console/console-6b965755b-2rjdw" Apr 16 18:20:53.358614 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:53.358495 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2859a692-31fc-43d0-a10d-d94fd969729c-oauth-serving-cert\") pod \"console-6b965755b-2rjdw\" (UID: \"2859a692-31fc-43d0-a10d-d94fd969729c\") " pod="openshift-console/console-6b965755b-2rjdw" Apr 16 18:20:53.358614 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:53.358519 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2859a692-31fc-43d0-a10d-d94fd969729c-console-config\") pod \"console-6b965755b-2rjdw\" (UID: \"2859a692-31fc-43d0-a10d-d94fd969729c\") " pod="openshift-console/console-6b965755b-2rjdw" Apr 16 
18:20:53.358614 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:53.358557 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2zsv\" (UniqueName: \"kubernetes.io/projected/2859a692-31fc-43d0-a10d-d94fd969729c-kube-api-access-s2zsv\") pod \"console-6b965755b-2rjdw\" (UID: \"2859a692-31fc-43d0-a10d-d94fd969729c\") " pod="openshift-console/console-6b965755b-2rjdw"
Apr 16 18:20:53.358614 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:53.358604 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2859a692-31fc-43d0-a10d-d94fd969729c-trusted-ca-bundle\") pod \"console-6b965755b-2rjdw\" (UID: \"2859a692-31fc-43d0-a10d-d94fd969729c\") " pod="openshift-console/console-6b965755b-2rjdw"
Apr 16 18:20:53.358783 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:53.358664 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2859a692-31fc-43d0-a10d-d94fd969729c-service-ca\") pod \"console-6b965755b-2rjdw\" (UID: \"2859a692-31fc-43d0-a10d-d94fd969729c\") " pod="openshift-console/console-6b965755b-2rjdw"
Apr 16 18:20:53.459764 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:53.459728 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2859a692-31fc-43d0-a10d-d94fd969729c-console-serving-cert\") pod \"console-6b965755b-2rjdw\" (UID: \"2859a692-31fc-43d0-a10d-d94fd969729c\") " pod="openshift-console/console-6b965755b-2rjdw"
Apr 16 18:20:53.459764 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:53.459767 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2859a692-31fc-43d0-a10d-d94fd969729c-console-oauth-config\") pod \"console-6b965755b-2rjdw\" (UID: \"2859a692-31fc-43d0-a10d-d94fd969729c\") " pod="openshift-console/console-6b965755b-2rjdw"
Apr 16 18:20:53.459969 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:53.459795 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2859a692-31fc-43d0-a10d-d94fd969729c-oauth-serving-cert\") pod \"console-6b965755b-2rjdw\" (UID: \"2859a692-31fc-43d0-a10d-d94fd969729c\") " pod="openshift-console/console-6b965755b-2rjdw"
Apr 16 18:20:53.459969 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:53.459820 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2859a692-31fc-43d0-a10d-d94fd969729c-console-config\") pod \"console-6b965755b-2rjdw\" (UID: \"2859a692-31fc-43d0-a10d-d94fd969729c\") " pod="openshift-console/console-6b965755b-2rjdw"
Apr 16 18:20:53.459969 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:53.459837 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s2zsv\" (UniqueName: \"kubernetes.io/projected/2859a692-31fc-43d0-a10d-d94fd969729c-kube-api-access-s2zsv\") pod \"console-6b965755b-2rjdw\" (UID: \"2859a692-31fc-43d0-a10d-d94fd969729c\") " pod="openshift-console/console-6b965755b-2rjdw"
Apr 16 18:20:53.459969 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:53.459861 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2859a692-31fc-43d0-a10d-d94fd969729c-trusted-ca-bundle\") pod \"console-6b965755b-2rjdw\" (UID: \"2859a692-31fc-43d0-a10d-d94fd969729c\") " pod="openshift-console/console-6b965755b-2rjdw"
Apr 16 18:20:53.459969 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:53.459879 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2859a692-31fc-43d0-a10d-d94fd969729c-service-ca\") pod \"console-6b965755b-2rjdw\" (UID: \"2859a692-31fc-43d0-a10d-d94fd969729c\") " pod="openshift-console/console-6b965755b-2rjdw"
Apr 16 18:20:53.460676 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:53.460648 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2859a692-31fc-43d0-a10d-d94fd969729c-oauth-serving-cert\") pod \"console-6b965755b-2rjdw\" (UID: \"2859a692-31fc-43d0-a10d-d94fd969729c\") " pod="openshift-console/console-6b965755b-2rjdw"
Apr 16 18:20:53.460787 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:53.460656 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2859a692-31fc-43d0-a10d-d94fd969729c-console-config\") pod \"console-6b965755b-2rjdw\" (UID: \"2859a692-31fc-43d0-a10d-d94fd969729c\") " pod="openshift-console/console-6b965755b-2rjdw"
Apr 16 18:20:53.460787 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:53.460743 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2859a692-31fc-43d0-a10d-d94fd969729c-service-ca\") pod \"console-6b965755b-2rjdw\" (UID: \"2859a692-31fc-43d0-a10d-d94fd969729c\") " pod="openshift-console/console-6b965755b-2rjdw"
Apr 16 18:20:53.460859 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:53.460791 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2859a692-31fc-43d0-a10d-d94fd969729c-trusted-ca-bundle\") pod \"console-6b965755b-2rjdw\" (UID: \"2859a692-31fc-43d0-a10d-d94fd969729c\") " pod="openshift-console/console-6b965755b-2rjdw"
Apr 16 18:20:53.462234 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:53.462211 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2859a692-31fc-43d0-a10d-d94fd969729c-console-serving-cert\") pod \"console-6b965755b-2rjdw\" (UID: \"2859a692-31fc-43d0-a10d-d94fd969729c\") " pod="openshift-console/console-6b965755b-2rjdw"
Apr 16 18:20:53.462325 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:53.462256 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2859a692-31fc-43d0-a10d-d94fd969729c-console-oauth-config\") pod \"console-6b965755b-2rjdw\" (UID: \"2859a692-31fc-43d0-a10d-d94fd969729c\") " pod="openshift-console/console-6b965755b-2rjdw"
Apr 16 18:20:53.467913 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:53.467891 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2zsv\" (UniqueName: \"kubernetes.io/projected/2859a692-31fc-43d0-a10d-d94fd969729c-kube-api-access-s2zsv\") pod \"console-6b965755b-2rjdw\" (UID: \"2859a692-31fc-43d0-a10d-d94fd969729c\") " pod="openshift-console/console-6b965755b-2rjdw"
Apr 16 18:20:53.594501 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:53.594420 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b965755b-2rjdw"
Apr 16 18:20:53.917995 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:53.917928 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b965755b-2rjdw"]
Apr 16 18:20:53.920435 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:20:53.920410 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2859a692_31fc_43d0_a10d_d94fd969729c.slice/crio-a01d9e275c44f74f212ee33e00237471e5037b510a99be847c6b417942ac8e38 WatchSource:0}: Error finding container a01d9e275c44f74f212ee33e00237471e5037b510a99be847c6b417942ac8e38: Status 404 returned error can't find the container with id a01d9e275c44f74f212ee33e00237471e5037b510a99be847c6b417942ac8e38
Apr 16 18:20:53.924504 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:53.924481 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b965755b-2rjdw" event={"ID":"2859a692-31fc-43d0-a10d-d94fd969729c","Type":"ContainerStarted","Data":"a01d9e275c44f74f212ee33e00237471e5037b510a99be847c6b417942ac8e38"}
Apr 16 18:20:54.928940 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:54.928906 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b965755b-2rjdw" event={"ID":"2859a692-31fc-43d0-a10d-d94fd969729c","Type":"ContainerStarted","Data":"d01bdfb3bcb6c24bb1014a20692e0383d0ab0fd7fdab7f57591bcf879a151ded"}
Apr 16 18:20:54.948204 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:20:54.948154 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6b965755b-2rjdw" podStartSLOduration=1.9481398680000002 podStartE2EDuration="1.948139868s" podCreationTimestamp="2026-04-16 18:20:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:20:54.946011024 +0000 UTC m=+256.453276151" watchObservedRunningTime="2026-04-16 18:20:54.948139868 +0000 UTC m=+256.455404984"
Apr 16 18:21:03.595291 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:03.595254 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6b965755b-2rjdw"
Apr 16 18:21:03.595757 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:03.595523 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6b965755b-2rjdw"
Apr 16 18:21:03.600001 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:03.599976 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6b965755b-2rjdw"
Apr 16 18:21:03.959118 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:03.959094 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6b965755b-2rjdw"
Apr 16 18:21:04.010519 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:04.010477 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-84949b9f8-4wj2f"]
Apr 16 18:21:29.034345 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:29.034289 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-84949b9f8-4wj2f" podUID="f2c8266e-9820-424c-a836-71c0ecf29c39" containerName="console" containerID="cri-o://5c1738435980179f1652bab7e442a8afd99c5d4fe5a2641696bf19adaff1c9c4" gracePeriod=15
Apr 16 18:21:29.265765 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:29.265743 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-84949b9f8-4wj2f_f2c8266e-9820-424c-a836-71c0ecf29c39/console/0.log"
Apr 16 18:21:29.265887 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:29.265805 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-84949b9f8-4wj2f"
Apr 16 18:21:29.350925 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:29.350836 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f2c8266e-9820-424c-a836-71c0ecf29c39-console-config\") pod \"f2c8266e-9820-424c-a836-71c0ecf29c39\" (UID: \"f2c8266e-9820-424c-a836-71c0ecf29c39\") "
Apr 16 18:21:29.350925 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:29.350918 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f2c8266e-9820-424c-a836-71c0ecf29c39-console-serving-cert\") pod \"f2c8266e-9820-424c-a836-71c0ecf29c39\" (UID: \"f2c8266e-9820-424c-a836-71c0ecf29c39\") "
Apr 16 18:21:29.351134 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:29.350936 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f2c8266e-9820-424c-a836-71c0ecf29c39-console-oauth-config\") pod \"f2c8266e-9820-424c-a836-71c0ecf29c39\" (UID: \"f2c8266e-9820-424c-a836-71c0ecf29c39\") "
Apr 16 18:21:29.351134 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:29.350953 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpqvh\" (UniqueName: \"kubernetes.io/projected/f2c8266e-9820-424c-a836-71c0ecf29c39-kube-api-access-kpqvh\") pod \"f2c8266e-9820-424c-a836-71c0ecf29c39\" (UID: \"f2c8266e-9820-424c-a836-71c0ecf29c39\") "
Apr 16 18:21:29.351134 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:29.350970 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2c8266e-9820-424c-a836-71c0ecf29c39-trusted-ca-bundle\") pod \"f2c8266e-9820-424c-a836-71c0ecf29c39\" (UID: \"f2c8266e-9820-424c-a836-71c0ecf29c39\") "
Apr 16 18:21:29.351134 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:29.351011 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f2c8266e-9820-424c-a836-71c0ecf29c39-service-ca\") pod \"f2c8266e-9820-424c-a836-71c0ecf29c39\" (UID: \"f2c8266e-9820-424c-a836-71c0ecf29c39\") "
Apr 16 18:21:29.351134 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:29.351038 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f2c8266e-9820-424c-a836-71c0ecf29c39-oauth-serving-cert\") pod \"f2c8266e-9820-424c-a836-71c0ecf29c39\" (UID: \"f2c8266e-9820-424c-a836-71c0ecf29c39\") "
Apr 16 18:21:29.351426 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:29.351275 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2c8266e-9820-424c-a836-71c0ecf29c39-console-config" (OuterVolumeSpecName: "console-config") pod "f2c8266e-9820-424c-a836-71c0ecf29c39" (UID: "f2c8266e-9820-424c-a836-71c0ecf29c39"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:21:29.351524 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:29.351456 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2c8266e-9820-424c-a836-71c0ecf29c39-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f2c8266e-9820-424c-a836-71c0ecf29c39" (UID: "f2c8266e-9820-424c-a836-71c0ecf29c39"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:21:29.351617 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:29.351490 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2c8266e-9820-424c-a836-71c0ecf29c39-service-ca" (OuterVolumeSpecName: "service-ca") pod "f2c8266e-9820-424c-a836-71c0ecf29c39" (UID: "f2c8266e-9820-424c-a836-71c0ecf29c39"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:21:29.351689 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:29.351670 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2c8266e-9820-424c-a836-71c0ecf29c39-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f2c8266e-9820-424c-a836-71c0ecf29c39" (UID: "f2c8266e-9820-424c-a836-71c0ecf29c39"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:21:29.353135 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:29.353111 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2c8266e-9820-424c-a836-71c0ecf29c39-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f2c8266e-9820-424c-a836-71c0ecf29c39" (UID: "f2c8266e-9820-424c-a836-71c0ecf29c39"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:21:29.353507 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:29.353486 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2c8266e-9820-424c-a836-71c0ecf29c39-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f2c8266e-9820-424c-a836-71c0ecf29c39" (UID: "f2c8266e-9820-424c-a836-71c0ecf29c39"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:21:29.353575 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:29.353533 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2c8266e-9820-424c-a836-71c0ecf29c39-kube-api-access-kpqvh" (OuterVolumeSpecName: "kube-api-access-kpqvh") pod "f2c8266e-9820-424c-a836-71c0ecf29c39" (UID: "f2c8266e-9820-424c-a836-71c0ecf29c39"). InnerVolumeSpecName "kube-api-access-kpqvh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:21:29.452173 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:29.452143 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f2c8266e-9820-424c-a836-71c0ecf29c39-service-ca\") on node \"ip-10-0-131-203.ec2.internal\" DevicePath \"\""
Apr 16 18:21:29.452173 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:29.452169 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f2c8266e-9820-424c-a836-71c0ecf29c39-oauth-serving-cert\") on node \"ip-10-0-131-203.ec2.internal\" DevicePath \"\""
Apr 16 18:21:29.452173 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:29.452180 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f2c8266e-9820-424c-a836-71c0ecf29c39-console-config\") on node \"ip-10-0-131-203.ec2.internal\" DevicePath \"\""
Apr 16 18:21:29.452385 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:29.452190 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f2c8266e-9820-424c-a836-71c0ecf29c39-console-serving-cert\") on node \"ip-10-0-131-203.ec2.internal\" DevicePath \"\""
Apr 16 18:21:29.452385 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:29.452198 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f2c8266e-9820-424c-a836-71c0ecf29c39-console-oauth-config\") on node \"ip-10-0-131-203.ec2.internal\" DevicePath \"\""
Apr 16 18:21:29.452385 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:29.452209 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kpqvh\" (UniqueName: \"kubernetes.io/projected/f2c8266e-9820-424c-a836-71c0ecf29c39-kube-api-access-kpqvh\") on node \"ip-10-0-131-203.ec2.internal\" DevicePath \"\""
Apr 16 18:21:29.452385 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:29.452218 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2c8266e-9820-424c-a836-71c0ecf29c39-trusted-ca-bundle\") on node \"ip-10-0-131-203.ec2.internal\" DevicePath \"\""
Apr 16 18:21:30.027412 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:30.027382 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-84949b9f8-4wj2f_f2c8266e-9820-424c-a836-71c0ecf29c39/console/0.log"
Apr 16 18:21:30.027600 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:30.027424 2573 generic.go:358] "Generic (PLEG): container finished" podID="f2c8266e-9820-424c-a836-71c0ecf29c39" containerID="5c1738435980179f1652bab7e442a8afd99c5d4fe5a2641696bf19adaff1c9c4" exitCode=2
Apr 16 18:21:30.027600 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:30.027458 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84949b9f8-4wj2f" event={"ID":"f2c8266e-9820-424c-a836-71c0ecf29c39","Type":"ContainerDied","Data":"5c1738435980179f1652bab7e442a8afd99c5d4fe5a2641696bf19adaff1c9c4"}
Apr 16 18:21:30.027600 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:30.027486 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84949b9f8-4wj2f" event={"ID":"f2c8266e-9820-424c-a836-71c0ecf29c39","Type":"ContainerDied","Data":"d6b356e6f35849838544fdc661083f5e3e91d1ce3344a11ec871ba49795beb51"}
Apr 16 18:21:30.027600 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:30.027504 2573 scope.go:117] "RemoveContainer" containerID="5c1738435980179f1652bab7e442a8afd99c5d4fe5a2641696bf19adaff1c9c4"
Apr 16 18:21:30.027600 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:30.027525 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-84949b9f8-4wj2f"
Apr 16 18:21:30.035295 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:30.035133 2573 scope.go:117] "RemoveContainer" containerID="5c1738435980179f1652bab7e442a8afd99c5d4fe5a2641696bf19adaff1c9c4"
Apr 16 18:21:30.035500 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:21:30.035357 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c1738435980179f1652bab7e442a8afd99c5d4fe5a2641696bf19adaff1c9c4\": container with ID starting with 5c1738435980179f1652bab7e442a8afd99c5d4fe5a2641696bf19adaff1c9c4 not found: ID does not exist" containerID="5c1738435980179f1652bab7e442a8afd99c5d4fe5a2641696bf19adaff1c9c4"
Apr 16 18:21:30.035500 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:30.035382 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c1738435980179f1652bab7e442a8afd99c5d4fe5a2641696bf19adaff1c9c4"} err="failed to get container status \"5c1738435980179f1652bab7e442a8afd99c5d4fe5a2641696bf19adaff1c9c4\": rpc error: code = NotFound desc = could not find container \"5c1738435980179f1652bab7e442a8afd99c5d4fe5a2641696bf19adaff1c9c4\": container with ID starting with 5c1738435980179f1652bab7e442a8afd99c5d4fe5a2641696bf19adaff1c9c4 not found: ID does not exist"
Apr 16 18:21:30.050731 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:30.050707 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-84949b9f8-4wj2f"]
Apr 16 18:21:30.053950 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:30.053930 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-84949b9f8-4wj2f"]
Apr 16 18:21:31.121140 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:31.121106 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2c8266e-9820-424c-a836-71c0ecf29c39" path="/var/lib/kubelet/pods/f2c8266e-9820-424c-a836-71c0ecf29c39/volumes"
Apr 16 18:21:38.986332 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:38.986291 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-dn4qx_a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0/console-operator/2.log"
Apr 16 18:21:38.987629 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:38.987605 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-dn4qx_a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0/console-operator/2.log"
Apr 16 18:21:38.996113 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:21:38.996096 2573 kubelet.go:1628] "Image garbage collection succeeded"
Apr 16 18:22:06.404479 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:06.404445 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5cfcbb68d5-77gc6"]
Apr 16 18:22:06.406946 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:06.404778 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f2c8266e-9820-424c-a836-71c0ecf29c39" containerName="console"
Apr 16 18:22:06.406946 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:06.404790 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2c8266e-9820-424c-a836-71c0ecf29c39" containerName="console"
Apr 16 18:22:06.406946 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:06.404840 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f2c8266e-9820-424c-a836-71c0ecf29c39" containerName="console"
Apr 16 18:22:06.407871 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:06.407850 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5cfcbb68d5-77gc6"
Apr 16 18:22:06.419866 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:06.419838 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5cfcbb68d5-77gc6"]
Apr 16 18:22:06.437951 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:06.437922 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db4fd461-bc31-4928-9356-791590b02269-console-config\") pod \"console-5cfcbb68d5-77gc6\" (UID: \"db4fd461-bc31-4928-9356-791590b02269\") " pod="openshift-console/console-5cfcbb68d5-77gc6"
Apr 16 18:22:06.438070 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:06.437969 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62hkn\" (UniqueName: \"kubernetes.io/projected/db4fd461-bc31-4928-9356-791590b02269-kube-api-access-62hkn\") pod \"console-5cfcbb68d5-77gc6\" (UID: \"db4fd461-bc31-4928-9356-791590b02269\") " pod="openshift-console/console-5cfcbb68d5-77gc6"
Apr 16 18:22:06.438070 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:06.438006 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db4fd461-bc31-4928-9356-791590b02269-oauth-serving-cert\") pod \"console-5cfcbb68d5-77gc6\" (UID: \"db4fd461-bc31-4928-9356-791590b02269\") " pod="openshift-console/console-5cfcbb68d5-77gc6"
Apr 16 18:22:06.438070 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:06.438050 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db4fd461-bc31-4928-9356-791590b02269-trusted-ca-bundle\") pod \"console-5cfcbb68d5-77gc6\" (UID: \"db4fd461-bc31-4928-9356-791590b02269\") " pod="openshift-console/console-5cfcbb68d5-77gc6"
Apr 16 18:22:06.438187 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:06.438072 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db4fd461-bc31-4928-9356-791590b02269-console-serving-cert\") pod \"console-5cfcbb68d5-77gc6\" (UID: \"db4fd461-bc31-4928-9356-791590b02269\") " pod="openshift-console/console-5cfcbb68d5-77gc6"
Apr 16 18:22:06.438187 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:06.438094 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db4fd461-bc31-4928-9356-791590b02269-service-ca\") pod \"console-5cfcbb68d5-77gc6\" (UID: \"db4fd461-bc31-4928-9356-791590b02269\") " pod="openshift-console/console-5cfcbb68d5-77gc6"
Apr 16 18:22:06.438187 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:06.438111 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db4fd461-bc31-4928-9356-791590b02269-console-oauth-config\") pod \"console-5cfcbb68d5-77gc6\" (UID: \"db4fd461-bc31-4928-9356-791590b02269\") " pod="openshift-console/console-5cfcbb68d5-77gc6"
Apr 16 18:22:06.539054 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:06.539020 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-62hkn\" (UniqueName: \"kubernetes.io/projected/db4fd461-bc31-4928-9356-791590b02269-kube-api-access-62hkn\") pod \"console-5cfcbb68d5-77gc6\" (UID: \"db4fd461-bc31-4928-9356-791590b02269\") " pod="openshift-console/console-5cfcbb68d5-77gc6"
Apr 16 18:22:06.539218 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:06.539069 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db4fd461-bc31-4928-9356-791590b02269-oauth-serving-cert\") pod \"console-5cfcbb68d5-77gc6\" (UID: \"db4fd461-bc31-4928-9356-791590b02269\") " pod="openshift-console/console-5cfcbb68d5-77gc6"
Apr 16 18:22:06.539218 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:06.539115 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db4fd461-bc31-4928-9356-791590b02269-trusted-ca-bundle\") pod \"console-5cfcbb68d5-77gc6\" (UID: \"db4fd461-bc31-4928-9356-791590b02269\") " pod="openshift-console/console-5cfcbb68d5-77gc6"
Apr 16 18:22:06.539218 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:06.539139 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db4fd461-bc31-4928-9356-791590b02269-console-serving-cert\") pod \"console-5cfcbb68d5-77gc6\" (UID: \"db4fd461-bc31-4928-9356-791590b02269\") " pod="openshift-console/console-5cfcbb68d5-77gc6"
Apr 16 18:22:06.539218 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:06.539155 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db4fd461-bc31-4928-9356-791590b02269-service-ca\") pod \"console-5cfcbb68d5-77gc6\" (UID: \"db4fd461-bc31-4928-9356-791590b02269\") " pod="openshift-console/console-5cfcbb68d5-77gc6"
Apr 16 18:22:06.539218 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:06.539172 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db4fd461-bc31-4928-9356-791590b02269-console-oauth-config\") pod \"console-5cfcbb68d5-77gc6\" (UID: \"db4fd461-bc31-4928-9356-791590b02269\") " pod="openshift-console/console-5cfcbb68d5-77gc6"
Apr 16 18:22:06.539218 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:06.539189 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db4fd461-bc31-4928-9356-791590b02269-console-config\") pod \"console-5cfcbb68d5-77gc6\" (UID: \"db4fd461-bc31-4928-9356-791590b02269\") " pod="openshift-console/console-5cfcbb68d5-77gc6"
Apr 16 18:22:06.539977 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:06.539951 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db4fd461-bc31-4928-9356-791590b02269-console-config\") pod \"console-5cfcbb68d5-77gc6\" (UID: \"db4fd461-bc31-4928-9356-791590b02269\") " pod="openshift-console/console-5cfcbb68d5-77gc6"
Apr 16 18:22:06.540095 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:06.539956 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db4fd461-bc31-4928-9356-791590b02269-oauth-serving-cert\") pod \"console-5cfcbb68d5-77gc6\" (UID: \"db4fd461-bc31-4928-9356-791590b02269\") " pod="openshift-console/console-5cfcbb68d5-77gc6"
Apr 16 18:22:06.540095 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:06.540016 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db4fd461-bc31-4928-9356-791590b02269-trusted-ca-bundle\") pod \"console-5cfcbb68d5-77gc6\" (UID: \"db4fd461-bc31-4928-9356-791590b02269\") " pod="openshift-console/console-5cfcbb68d5-77gc6"
Apr 16 18:22:06.540095 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:06.540038 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db4fd461-bc31-4928-9356-791590b02269-service-ca\") pod \"console-5cfcbb68d5-77gc6\" (UID: \"db4fd461-bc31-4928-9356-791590b02269\") " pod="openshift-console/console-5cfcbb68d5-77gc6"
Apr 16 18:22:06.541832 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:06.541810 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db4fd461-bc31-4928-9356-791590b02269-console-oauth-config\") pod \"console-5cfcbb68d5-77gc6\" (UID: \"db4fd461-bc31-4928-9356-791590b02269\") " pod="openshift-console/console-5cfcbb68d5-77gc6"
Apr 16 18:22:06.541915 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:06.541864 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db4fd461-bc31-4928-9356-791590b02269-console-serving-cert\") pod \"console-5cfcbb68d5-77gc6\" (UID: \"db4fd461-bc31-4928-9356-791590b02269\") " pod="openshift-console/console-5cfcbb68d5-77gc6"
Apr 16 18:22:06.547684 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:06.547664 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-62hkn\" (UniqueName: \"kubernetes.io/projected/db4fd461-bc31-4928-9356-791590b02269-kube-api-access-62hkn\") pod \"console-5cfcbb68d5-77gc6\" (UID: \"db4fd461-bc31-4928-9356-791590b02269\") " pod="openshift-console/console-5cfcbb68d5-77gc6"
Apr 16 18:22:06.718508 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:06.718426 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5cfcbb68d5-77gc6"
Apr 16 18:22:06.843529 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:06.843498 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5cfcbb68d5-77gc6"]
Apr 16 18:22:06.846291 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:22:06.846264 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb4fd461_bc31_4928_9356_791590b02269.slice/crio-b8319998d65dd6b53561425d6622f347defbad627685080675c337fbd87557df WatchSource:0}: Error finding container b8319998d65dd6b53561425d6622f347defbad627685080675c337fbd87557df: Status 404 returned error can't find the container with id b8319998d65dd6b53561425d6622f347defbad627685080675c337fbd87557df
Apr 16 18:22:06.848047 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:06.848030 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:22:07.130178 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:07.130142 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5cfcbb68d5-77gc6" event={"ID":"db4fd461-bc31-4928-9356-791590b02269","Type":"ContainerStarted","Data":"3e65c1b0ee0f1481be032be5524a3ea5e8c7f564e9bb5e3327402b43b1db4a26"}
Apr 16 18:22:07.130178 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:07.130182 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5cfcbb68d5-77gc6" event={"ID":"db4fd461-bc31-4928-9356-791590b02269","Type":"ContainerStarted","Data":"b8319998d65dd6b53561425d6622f347defbad627685080675c337fbd87557df"}
Apr 16 18:22:07.150077 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:07.150031 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5cfcbb68d5-77gc6" podStartSLOduration=1.150018481 podStartE2EDuration="1.150018481s" podCreationTimestamp="2026-04-16 18:22:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:22:07.14809315 +0000 UTC m=+328.655358266" watchObservedRunningTime="2026-04-16 18:22:07.150018481 +0000 UTC m=+328.657283596"
Apr 16 18:22:16.719296 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:16.719256 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5cfcbb68d5-77gc6"
Apr 16 18:22:16.719759 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:16.719340 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5cfcbb68d5-77gc6"
Apr 16 18:22:16.724021 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:16.724001 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5cfcbb68d5-77gc6"
Apr 16 18:22:17.159419 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:17.159395 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5cfcbb68d5-77gc6"
Apr 16 18:22:17.211684 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:17.211652 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6b965755b-2rjdw"]
Apr 16 18:22:42.237438 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:42.237311 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6b965755b-2rjdw" podUID="2859a692-31fc-43d0-a10d-d94fd969729c" containerName="console" containerID="cri-o://d01bdfb3bcb6c24bb1014a20692e0383d0ab0fd7fdab7f57591bcf879a151ded" gracePeriod=15
Apr 16 18:22:42.466461 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:42.466440 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6b965755b-2rjdw_2859a692-31fc-43d0-a10d-d94fd969729c/console/0.log"
Apr 16 18:22:42.466575 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:42.466498 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b965755b-2rjdw"
Apr 16 18:22:42.530749 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:42.530674 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2859a692-31fc-43d0-a10d-d94fd969729c-console-serving-cert\") pod \"2859a692-31fc-43d0-a10d-d94fd969729c\" (UID: \"2859a692-31fc-43d0-a10d-d94fd969729c\") "
Apr 16 18:22:42.530749 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:42.530708 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2zsv\" (UniqueName: \"kubernetes.io/projected/2859a692-31fc-43d0-a10d-d94fd969729c-kube-api-access-s2zsv\") pod \"2859a692-31fc-43d0-a10d-d94fd969729c\" (UID: \"2859a692-31fc-43d0-a10d-d94fd969729c\") "
Apr 16 18:22:42.530749 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:42.530738 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2859a692-31fc-43d0-a10d-d94fd969729c-oauth-serving-cert\") pod \"2859a692-31fc-43d0-a10d-d94fd969729c\" (UID: \"2859a692-31fc-43d0-a10d-d94fd969729c\") "
Apr 16 18:22:42.530997 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:42.530760 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2859a692-31fc-43d0-a10d-d94fd969729c-console-config\") pod \"2859a692-31fc-43d0-a10d-d94fd969729c\" (UID: \"2859a692-31fc-43d0-a10d-d94fd969729c\") "
Apr 16 18:22:42.530997 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:42.530811 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2859a692-31fc-43d0-a10d-d94fd969729c-service-ca\") pod \"2859a692-31fc-43d0-a10d-d94fd969729c\" (UID: \"2859a692-31fc-43d0-a10d-d94fd969729c\")
" Apr 16 18:22:42.530997 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:42.530848 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2859a692-31fc-43d0-a10d-d94fd969729c-trusted-ca-bundle\") pod \"2859a692-31fc-43d0-a10d-d94fd969729c\" (UID: \"2859a692-31fc-43d0-a10d-d94fd969729c\") " Apr 16 18:22:42.530997 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:42.530867 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2859a692-31fc-43d0-a10d-d94fd969729c-console-oauth-config\") pod \"2859a692-31fc-43d0-a10d-d94fd969729c\" (UID: \"2859a692-31fc-43d0-a10d-d94fd969729c\") " Apr 16 18:22:42.531260 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:42.531231 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2859a692-31fc-43d0-a10d-d94fd969729c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "2859a692-31fc-43d0-a10d-d94fd969729c" (UID: "2859a692-31fc-43d0-a10d-d94fd969729c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:22:42.531326 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:42.531278 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2859a692-31fc-43d0-a10d-d94fd969729c-console-config" (OuterVolumeSpecName: "console-config") pod "2859a692-31fc-43d0-a10d-d94fd969729c" (UID: "2859a692-31fc-43d0-a10d-d94fd969729c"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:22:42.531378 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:42.531339 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2859a692-31fc-43d0-a10d-d94fd969729c-service-ca" (OuterVolumeSpecName: "service-ca") pod "2859a692-31fc-43d0-a10d-d94fd969729c" (UID: "2859a692-31fc-43d0-a10d-d94fd969729c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:22:42.531378 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:42.531350 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2859a692-31fc-43d0-a10d-d94fd969729c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "2859a692-31fc-43d0-a10d-d94fd969729c" (UID: "2859a692-31fc-43d0-a10d-d94fd969729c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:22:42.532907 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:42.532874 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2859a692-31fc-43d0-a10d-d94fd969729c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "2859a692-31fc-43d0-a10d-d94fd969729c" (UID: "2859a692-31fc-43d0-a10d-d94fd969729c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:22:42.532997 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:42.532913 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2859a692-31fc-43d0-a10d-d94fd969729c-kube-api-access-s2zsv" (OuterVolumeSpecName: "kube-api-access-s2zsv") pod "2859a692-31fc-43d0-a10d-d94fd969729c" (UID: "2859a692-31fc-43d0-a10d-d94fd969729c"). InnerVolumeSpecName "kube-api-access-s2zsv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:22:42.532997 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:42.532941 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2859a692-31fc-43d0-a10d-d94fd969729c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "2859a692-31fc-43d0-a10d-d94fd969729c" (UID: "2859a692-31fc-43d0-a10d-d94fd969729c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:22:42.632144 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:42.632105 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2859a692-31fc-43d0-a10d-d94fd969729c-trusted-ca-bundle\") on node \"ip-10-0-131-203.ec2.internal\" DevicePath \"\"" Apr 16 18:22:42.632144 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:42.632137 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2859a692-31fc-43d0-a10d-d94fd969729c-console-oauth-config\") on node \"ip-10-0-131-203.ec2.internal\" DevicePath \"\"" Apr 16 18:22:42.632144 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:42.632147 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2859a692-31fc-43d0-a10d-d94fd969729c-console-serving-cert\") on node \"ip-10-0-131-203.ec2.internal\" DevicePath \"\"" Apr 16 18:22:42.632377 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:42.632159 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s2zsv\" (UniqueName: \"kubernetes.io/projected/2859a692-31fc-43d0-a10d-d94fd969729c-kube-api-access-s2zsv\") on node \"ip-10-0-131-203.ec2.internal\" DevicePath \"\"" Apr 16 18:22:42.632377 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:42.632169 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/2859a692-31fc-43d0-a10d-d94fd969729c-oauth-serving-cert\") on node \"ip-10-0-131-203.ec2.internal\" DevicePath \"\"" Apr 16 18:22:42.632377 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:42.632178 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2859a692-31fc-43d0-a10d-d94fd969729c-console-config\") on node \"ip-10-0-131-203.ec2.internal\" DevicePath \"\"" Apr 16 18:22:42.632377 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:42.632187 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2859a692-31fc-43d0-a10d-d94fd969729c-service-ca\") on node \"ip-10-0-131-203.ec2.internal\" DevicePath \"\"" Apr 16 18:22:43.227039 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:43.227013 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6b965755b-2rjdw_2859a692-31fc-43d0-a10d-d94fd969729c/console/0.log" Apr 16 18:22:43.227193 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:43.227051 2573 generic.go:358] "Generic (PLEG): container finished" podID="2859a692-31fc-43d0-a10d-d94fd969729c" containerID="d01bdfb3bcb6c24bb1014a20692e0383d0ab0fd7fdab7f57591bcf879a151ded" exitCode=2 Apr 16 18:22:43.227193 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:43.227108 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b965755b-2rjdw" event={"ID":"2859a692-31fc-43d0-a10d-d94fd969729c","Type":"ContainerDied","Data":"d01bdfb3bcb6c24bb1014a20692e0383d0ab0fd7fdab7f57591bcf879a151ded"} Apr 16 18:22:43.227193 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:43.227118 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6b965755b-2rjdw" Apr 16 18:22:43.227193 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:43.227135 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b965755b-2rjdw" event={"ID":"2859a692-31fc-43d0-a10d-d94fd969729c","Type":"ContainerDied","Data":"a01d9e275c44f74f212ee33e00237471e5037b510a99be847c6b417942ac8e38"} Apr 16 18:22:43.227193 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:43.227150 2573 scope.go:117] "RemoveContainer" containerID="d01bdfb3bcb6c24bb1014a20692e0383d0ab0fd7fdab7f57591bcf879a151ded" Apr 16 18:22:43.234632 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:43.234613 2573 scope.go:117] "RemoveContainer" containerID="d01bdfb3bcb6c24bb1014a20692e0383d0ab0fd7fdab7f57591bcf879a151ded" Apr 16 18:22:43.234884 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:22:43.234851 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d01bdfb3bcb6c24bb1014a20692e0383d0ab0fd7fdab7f57591bcf879a151ded\": container with ID starting with d01bdfb3bcb6c24bb1014a20692e0383d0ab0fd7fdab7f57591bcf879a151ded not found: ID does not exist" containerID="d01bdfb3bcb6c24bb1014a20692e0383d0ab0fd7fdab7f57591bcf879a151ded" Apr 16 18:22:43.234974 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:43.234884 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d01bdfb3bcb6c24bb1014a20692e0383d0ab0fd7fdab7f57591bcf879a151ded"} err="failed to get container status \"d01bdfb3bcb6c24bb1014a20692e0383d0ab0fd7fdab7f57591bcf879a151ded\": rpc error: code = NotFound desc = could not find container \"d01bdfb3bcb6c24bb1014a20692e0383d0ab0fd7fdab7f57591bcf879a151ded\": container with ID starting with d01bdfb3bcb6c24bb1014a20692e0383d0ab0fd7fdab7f57591bcf879a151ded not found: ID does not exist" Apr 16 18:22:43.245177 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:43.245155 2573 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6b965755b-2rjdw"] Apr 16 18:22:43.250283 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:43.250265 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6b965755b-2rjdw"] Apr 16 18:22:45.120582 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:45.120532 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2859a692-31fc-43d0-a10d-d94fd969729c" path="/var/lib/kubelet/pods/2859a692-31fc-43d0-a10d-d94fd969729c/volumes" Apr 16 18:22:47.542525 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:47.542490 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-xnggq"] Apr 16 18:22:47.542894 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:47.542836 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2859a692-31fc-43d0-a10d-d94fd969729c" containerName="console" Apr 16 18:22:47.542894 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:47.542849 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="2859a692-31fc-43d0-a10d-d94fd969729c" containerName="console" Apr 16 18:22:47.542974 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:47.542911 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="2859a692-31fc-43d0-a10d-d94fd969729c" containerName="console" Apr 16 18:22:47.547393 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:47.547374 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-xnggq" Apr 16 18:22:47.550537 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:47.550522 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 18:22:47.556855 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:47.556833 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-xnggq"] Apr 16 18:22:47.673432 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:47.673381 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7da343e6-5030-4252-8d63-0b49d5c3ff00-original-pull-secret\") pod \"global-pull-secret-syncer-xnggq\" (UID: \"7da343e6-5030-4252-8d63-0b49d5c3ff00\") " pod="kube-system/global-pull-secret-syncer-xnggq" Apr 16 18:22:47.673667 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:47.673505 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7da343e6-5030-4252-8d63-0b49d5c3ff00-kubelet-config\") pod \"global-pull-secret-syncer-xnggq\" (UID: \"7da343e6-5030-4252-8d63-0b49d5c3ff00\") " pod="kube-system/global-pull-secret-syncer-xnggq" Apr 16 18:22:47.673667 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:47.673587 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7da343e6-5030-4252-8d63-0b49d5c3ff00-dbus\") pod \"global-pull-secret-syncer-xnggq\" (UID: \"7da343e6-5030-4252-8d63-0b49d5c3ff00\") " pod="kube-system/global-pull-secret-syncer-xnggq" Apr 16 18:22:47.774218 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:47.774174 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/7da343e6-5030-4252-8d63-0b49d5c3ff00-kubelet-config\") pod \"global-pull-secret-syncer-xnggq\" (UID: \"7da343e6-5030-4252-8d63-0b49d5c3ff00\") " pod="kube-system/global-pull-secret-syncer-xnggq" Apr 16 18:22:47.774356 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:47.774229 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7da343e6-5030-4252-8d63-0b49d5c3ff00-dbus\") pod \"global-pull-secret-syncer-xnggq\" (UID: \"7da343e6-5030-4252-8d63-0b49d5c3ff00\") " pod="kube-system/global-pull-secret-syncer-xnggq" Apr 16 18:22:47.774356 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:47.774254 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7da343e6-5030-4252-8d63-0b49d5c3ff00-original-pull-secret\") pod \"global-pull-secret-syncer-xnggq\" (UID: \"7da343e6-5030-4252-8d63-0b49d5c3ff00\") " pod="kube-system/global-pull-secret-syncer-xnggq" Apr 16 18:22:47.774356 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:47.774319 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7da343e6-5030-4252-8d63-0b49d5c3ff00-kubelet-config\") pod \"global-pull-secret-syncer-xnggq\" (UID: \"7da343e6-5030-4252-8d63-0b49d5c3ff00\") " pod="kube-system/global-pull-secret-syncer-xnggq" Apr 16 18:22:47.774464 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:47.774421 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7da343e6-5030-4252-8d63-0b49d5c3ff00-dbus\") pod \"global-pull-secret-syncer-xnggq\" (UID: \"7da343e6-5030-4252-8d63-0b49d5c3ff00\") " pod="kube-system/global-pull-secret-syncer-xnggq" Apr 16 18:22:47.776628 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:47.776609 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7da343e6-5030-4252-8d63-0b49d5c3ff00-original-pull-secret\") pod \"global-pull-secret-syncer-xnggq\" (UID: \"7da343e6-5030-4252-8d63-0b49d5c3ff00\") " pod="kube-system/global-pull-secret-syncer-xnggq" Apr 16 18:22:47.856396 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:47.856308 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xnggq" Apr 16 18:22:47.992417 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:47.992389 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-xnggq"] Apr 16 18:22:47.994944 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:22:47.994911 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7da343e6_5030_4252_8d63_0b49d5c3ff00.slice/crio-f9306309a8dbf4c48c4aca725b5f5f8cb2fa936809abb37e39845d12b06c95cd WatchSource:0}: Error finding container f9306309a8dbf4c48c4aca725b5f5f8cb2fa936809abb37e39845d12b06c95cd: Status 404 returned error can't find the container with id f9306309a8dbf4c48c4aca725b5f5f8cb2fa936809abb37e39845d12b06c95cd Apr 16 18:22:48.242991 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:48.242951 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-xnggq" event={"ID":"7da343e6-5030-4252-8d63-0b49d5c3ff00","Type":"ContainerStarted","Data":"f9306309a8dbf4c48c4aca725b5f5f8cb2fa936809abb37e39845d12b06c95cd"} Apr 16 18:22:52.256801 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:52.256765 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-xnggq" event={"ID":"7da343e6-5030-4252-8d63-0b49d5c3ff00","Type":"ContainerStarted","Data":"7bc92bcb793fed3e01ba5c4df8f75ea05e0ed4d2b42e4c564022aa649e4844b6"} Apr 16 18:22:52.272455 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:22:52.272411 2573 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-xnggq" podStartSLOduration=1.214421654 podStartE2EDuration="5.272395643s" podCreationTimestamp="2026-04-16 18:22:47 +0000 UTC" firstStartedPulling="2026-04-16 18:22:47.996320512 +0000 UTC m=+369.503585606" lastFinishedPulling="2026-04-16 18:22:52.0542945 +0000 UTC m=+373.561559595" observedRunningTime="2026-04-16 18:22:52.271801505 +0000 UTC m=+373.779066621" watchObservedRunningTime="2026-04-16 18:22:52.272395643 +0000 UTC m=+373.779660758" Apr 16 18:23:39.833417 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:23:39.833379 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctldq9"] Apr 16 18:23:39.836191 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:23:39.836174 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctldq9" Apr 16 18:23:39.838788 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:23:39.838761 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 18:23:39.839820 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:23:39.839780 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 18:23:39.839820 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:23:39.839794 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-tn2js\"" Apr 16 18:23:39.847621 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:23:39.847595 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctldq9"] Apr 16 18:23:39.879775 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:23:39.879746 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/86bd5ed7-59c0-44e2-aba1-3ec9f15d0735-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctldq9\" (UID: \"86bd5ed7-59c0-44e2-aba1-3ec9f15d0735\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctldq9" Apr 16 18:23:39.879898 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:23:39.879785 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86bd5ed7-59c0-44e2-aba1-3ec9f15d0735-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctldq9\" (UID: \"86bd5ed7-59c0-44e2-aba1-3ec9f15d0735\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctldq9" Apr 16 18:23:39.879954 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:23:39.879891 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kftt6\" (UniqueName: \"kubernetes.io/projected/86bd5ed7-59c0-44e2-aba1-3ec9f15d0735-kube-api-access-kftt6\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctldq9\" (UID: \"86bd5ed7-59c0-44e2-aba1-3ec9f15d0735\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctldq9" Apr 16 18:23:39.980531 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:23:39.980489 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kftt6\" (UniqueName: \"kubernetes.io/projected/86bd5ed7-59c0-44e2-aba1-3ec9f15d0735-kube-api-access-kftt6\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctldq9\" (UID: \"86bd5ed7-59c0-44e2-aba1-3ec9f15d0735\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctldq9" Apr 16 18:23:39.980710 ip-10-0-131-203 kubenswrapper[2573]: I0416 
18:23:39.980564 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/86bd5ed7-59c0-44e2-aba1-3ec9f15d0735-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctldq9\" (UID: \"86bd5ed7-59c0-44e2-aba1-3ec9f15d0735\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctldq9" Apr 16 18:23:39.980710 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:23:39.980601 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86bd5ed7-59c0-44e2-aba1-3ec9f15d0735-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctldq9\" (UID: \"86bd5ed7-59c0-44e2-aba1-3ec9f15d0735\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctldq9" Apr 16 18:23:39.980913 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:23:39.980893 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/86bd5ed7-59c0-44e2-aba1-3ec9f15d0735-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctldq9\" (UID: \"86bd5ed7-59c0-44e2-aba1-3ec9f15d0735\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctldq9" Apr 16 18:23:39.980952 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:23:39.980934 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86bd5ed7-59c0-44e2-aba1-3ec9f15d0735-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctldq9\" (UID: \"86bd5ed7-59c0-44e2-aba1-3ec9f15d0735\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctldq9" Apr 16 18:23:39.988821 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:23:39.988797 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kftt6\" (UniqueName: 
\"kubernetes.io/projected/86bd5ed7-59c0-44e2-aba1-3ec9f15d0735-kube-api-access-kftt6\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctldq9\" (UID: \"86bd5ed7-59c0-44e2-aba1-3ec9f15d0735\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctldq9" Apr 16 18:23:40.144769 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:23:40.144728 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctldq9" Apr 16 18:23:40.269966 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:23:40.269935 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctldq9"] Apr 16 18:23:40.272276 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:23:40.272233 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86bd5ed7_59c0_44e2_aba1_3ec9f15d0735.slice/crio-c95121b65001d2eb9fc1fd11c01502726a544b225dd9e152c45f63b4154d4096 WatchSource:0}: Error finding container c95121b65001d2eb9fc1fd11c01502726a544b225dd9e152c45f63b4154d4096: Status 404 returned error can't find the container with id c95121b65001d2eb9fc1fd11c01502726a544b225dd9e152c45f63b4154d4096 Apr 16 18:23:40.389368 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:23:40.389334 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctldq9" event={"ID":"86bd5ed7-59c0-44e2-aba1-3ec9f15d0735","Type":"ContainerStarted","Data":"c95121b65001d2eb9fc1fd11c01502726a544b225dd9e152c45f63b4154d4096"} Apr 16 18:23:45.407161 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:23:45.407126 2573 generic.go:358] "Generic (PLEG): container finished" podID="86bd5ed7-59c0-44e2-aba1-3ec9f15d0735" containerID="c5b7a4931591b3822a317aa4a7cf5877e1dfeadc009ff716737662b6e91e4727" exitCode=0 Apr 
16 18:23:45.407598 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:23:45.407183 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctldq9" event={"ID":"86bd5ed7-59c0-44e2-aba1-3ec9f15d0735","Type":"ContainerDied","Data":"c5b7a4931591b3822a317aa4a7cf5877e1dfeadc009ff716737662b6e91e4727"} Apr 16 18:23:48.417251 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:23:48.417203 2573 generic.go:358] "Generic (PLEG): container finished" podID="86bd5ed7-59c0-44e2-aba1-3ec9f15d0735" containerID="c42eaee159dd943ea09727bd3a7e13fde861fd1a5407bd720ee4c72741f7724f" exitCode=0 Apr 16 18:23:48.417669 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:23:48.417256 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctldq9" event={"ID":"86bd5ed7-59c0-44e2-aba1-3ec9f15d0735","Type":"ContainerDied","Data":"c42eaee159dd943ea09727bd3a7e13fde861fd1a5407bd720ee4c72741f7724f"} Apr 16 18:23:54.436741 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:23:54.436705 2573 generic.go:358] "Generic (PLEG): container finished" podID="86bd5ed7-59c0-44e2-aba1-3ec9f15d0735" containerID="4fb509529935b0c9bd108d987831aca8af56650509c7021b1ea37fb3d69acc5e" exitCode=0 Apr 16 18:23:54.437113 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:23:54.436775 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctldq9" event={"ID":"86bd5ed7-59c0-44e2-aba1-3ec9f15d0735","Type":"ContainerDied","Data":"4fb509529935b0c9bd108d987831aca8af56650509c7021b1ea37fb3d69acc5e"} Apr 16 18:23:55.554253 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:23:55.554229 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctldq9" Apr 16 18:23:55.715088 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:23:55.715004 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86bd5ed7-59c0-44e2-aba1-3ec9f15d0735-bundle\") pod \"86bd5ed7-59c0-44e2-aba1-3ec9f15d0735\" (UID: \"86bd5ed7-59c0-44e2-aba1-3ec9f15d0735\") " Apr 16 18:23:55.715088 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:23:55.715061 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kftt6\" (UniqueName: \"kubernetes.io/projected/86bd5ed7-59c0-44e2-aba1-3ec9f15d0735-kube-api-access-kftt6\") pod \"86bd5ed7-59c0-44e2-aba1-3ec9f15d0735\" (UID: \"86bd5ed7-59c0-44e2-aba1-3ec9f15d0735\") " Apr 16 18:23:55.715297 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:23:55.715151 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/86bd5ed7-59c0-44e2-aba1-3ec9f15d0735-util\") pod \"86bd5ed7-59c0-44e2-aba1-3ec9f15d0735\" (UID: \"86bd5ed7-59c0-44e2-aba1-3ec9f15d0735\") " Apr 16 18:23:55.715638 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:23:55.715614 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86bd5ed7-59c0-44e2-aba1-3ec9f15d0735-bundle" (OuterVolumeSpecName: "bundle") pod "86bd5ed7-59c0-44e2-aba1-3ec9f15d0735" (UID: "86bd5ed7-59c0-44e2-aba1-3ec9f15d0735"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:23:55.717199 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:23:55.717169 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86bd5ed7-59c0-44e2-aba1-3ec9f15d0735-kube-api-access-kftt6" (OuterVolumeSpecName: "kube-api-access-kftt6") pod "86bd5ed7-59c0-44e2-aba1-3ec9f15d0735" (UID: "86bd5ed7-59c0-44e2-aba1-3ec9f15d0735"). InnerVolumeSpecName "kube-api-access-kftt6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:23:55.719100 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:23:55.719078 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86bd5ed7-59c0-44e2-aba1-3ec9f15d0735-util" (OuterVolumeSpecName: "util") pod "86bd5ed7-59c0-44e2-aba1-3ec9f15d0735" (UID: "86bd5ed7-59c0-44e2-aba1-3ec9f15d0735"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:23:55.816041 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:23:55.815993 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/86bd5ed7-59c0-44e2-aba1-3ec9f15d0735-util\") on node \"ip-10-0-131-203.ec2.internal\" DevicePath \"\"" Apr 16 18:23:55.816041 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:23:55.816035 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86bd5ed7-59c0-44e2-aba1-3ec9f15d0735-bundle\") on node \"ip-10-0-131-203.ec2.internal\" DevicePath \"\"" Apr 16 18:23:55.816041 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:23:55.816045 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kftt6\" (UniqueName: \"kubernetes.io/projected/86bd5ed7-59c0-44e2-aba1-3ec9f15d0735-kube-api-access-kftt6\") on node \"ip-10-0-131-203.ec2.internal\" DevicePath \"\"" Apr 16 18:23:56.443811 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:23:56.443777 2573 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctldq9" event={"ID":"86bd5ed7-59c0-44e2-aba1-3ec9f15d0735","Type":"ContainerDied","Data":"c95121b65001d2eb9fc1fd11c01502726a544b225dd9e152c45f63b4154d4096"} Apr 16 18:23:56.443811 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:23:56.443808 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c95121b65001d2eb9fc1fd11c01502726a544b225dd9e152c45f63b4154d4096" Apr 16 18:23:56.443811 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:23:56.443811 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctldq9" Apr 16 18:24:01.830680 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:01.830647 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hlqrx"] Apr 16 18:24:01.831058 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:01.830944 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86bd5ed7-59c0-44e2-aba1-3ec9f15d0735" containerName="extract" Apr 16 18:24:01.831058 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:01.830954 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="86bd5ed7-59c0-44e2-aba1-3ec9f15d0735" containerName="extract" Apr 16 18:24:01.831058 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:01.830968 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86bd5ed7-59c0-44e2-aba1-3ec9f15d0735" containerName="util" Apr 16 18:24:01.831058 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:01.830973 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="86bd5ed7-59c0-44e2-aba1-3ec9f15d0735" containerName="util" Apr 16 18:24:01.831058 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:01.830985 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="86bd5ed7-59c0-44e2-aba1-3ec9f15d0735" containerName="pull" Apr 16 18:24:01.831058 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:01.830990 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="86bd5ed7-59c0-44e2-aba1-3ec9f15d0735" containerName="pull" Apr 16 18:24:01.831058 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:01.831040 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="86bd5ed7-59c0-44e2-aba1-3ec9f15d0735" containerName="extract" Apr 16 18:24:01.876796 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:01.876760 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hlqrx"] Apr 16 18:24:01.876796 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:01.876789 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hlqrx" Apr 16 18:24:01.879515 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:01.879497 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 18:24:01.879779 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:01.879753 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 16 18:24:01.879779 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:01.879768 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-fd95n\"" Apr 16 18:24:01.879913 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:01.879817 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 18:24:01.964384 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:01.964347 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7tsw\" (UniqueName: 
\"kubernetes.io/projected/6e635122-49ea-474d-bb81-435372c4eb58-kube-api-access-k7tsw\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-hlqrx\" (UID: \"6e635122-49ea-474d-bb81-435372c4eb58\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hlqrx" Apr 16 18:24:01.964573 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:01.964407 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/6e635122-49ea-474d-bb81-435372c4eb58-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-hlqrx\" (UID: \"6e635122-49ea-474d-bb81-435372c4eb58\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hlqrx" Apr 16 18:24:02.065652 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:02.065607 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/6e635122-49ea-474d-bb81-435372c4eb58-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-hlqrx\" (UID: \"6e635122-49ea-474d-bb81-435372c4eb58\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hlqrx" Apr 16 18:24:02.065817 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:02.065707 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k7tsw\" (UniqueName: \"kubernetes.io/projected/6e635122-49ea-474d-bb81-435372c4eb58-kube-api-access-k7tsw\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-hlqrx\" (UID: \"6e635122-49ea-474d-bb81-435372c4eb58\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hlqrx" Apr 16 18:24:02.067886 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:02.067862 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/6e635122-49ea-474d-bb81-435372c4eb58-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-hlqrx\" (UID: 
\"6e635122-49ea-474d-bb81-435372c4eb58\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hlqrx" Apr 16 18:24:02.075017 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:02.074995 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7tsw\" (UniqueName: \"kubernetes.io/projected/6e635122-49ea-474d-bb81-435372c4eb58-kube-api-access-k7tsw\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-hlqrx\" (UID: \"6e635122-49ea-474d-bb81-435372c4eb58\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hlqrx" Apr 16 18:24:02.186295 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:02.186267 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hlqrx" Apr 16 18:24:02.303093 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:02.303066 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hlqrx"] Apr 16 18:24:02.305527 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:24:02.305500 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e635122_49ea_474d_bb81_435372c4eb58.slice/crio-cb5a4366d2ab27e2e5e225b51c35931b150f94af0077d331bf1088b85cfe9142 WatchSource:0}: Error finding container cb5a4366d2ab27e2e5e225b51c35931b150f94af0077d331bf1088b85cfe9142: Status 404 returned error can't find the container with id cb5a4366d2ab27e2e5e225b51c35931b150f94af0077d331bf1088b85cfe9142 Apr 16 18:24:02.463535 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:02.463436 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hlqrx" event={"ID":"6e635122-49ea-474d-bb81-435372c4eb58","Type":"ContainerStarted","Data":"cb5a4366d2ab27e2e5e225b51c35931b150f94af0077d331bf1088b85cfe9142"} Apr 16 18:24:06.474247 ip-10-0-131-203 kubenswrapper[2573]: 
I0416 18:24:06.474213 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-ghmht"] Apr 16 18:24:06.500594 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:06.500558 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-ghmht"] Apr 16 18:24:06.500594 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:06.500590 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-ghmht" Apr 16 18:24:06.500777 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:06.500602 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hlqrx" event={"ID":"6e635122-49ea-474d-bb81-435372c4eb58","Type":"ContainerStarted","Data":"87cafb88ea72e0aff2f313be4bbcc977ae1c0093334e6bec6785167f7cdd7605"} Apr 16 18:24:06.500829 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:06.500795 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hlqrx" Apr 16 18:24:06.503420 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:06.503398 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 16 18:24:06.503782 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:06.503763 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 16 18:24:06.503885 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:06.503800 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-7f2q2\"" Apr 16 18:24:06.525984 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:06.525940 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hlqrx" 
podStartSLOduration=1.8962091760000002 podStartE2EDuration="5.525925406s" podCreationTimestamp="2026-04-16 18:24:01 +0000 UTC" firstStartedPulling="2026-04-16 18:24:02.307126149 +0000 UTC m=+443.814391243" lastFinishedPulling="2026-04-16 18:24:05.936842379 +0000 UTC m=+447.444107473" observedRunningTime="2026-04-16 18:24:06.524043882 +0000 UTC m=+448.031309030" watchObservedRunningTime="2026-04-16 18:24:06.525925406 +0000 UTC m=+448.033190521" Apr 16 18:24:06.607039 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:06.607000 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/87f85b29-4228-4a2e-9bdd-0e08f81fbda4-certificates\") pod \"keda-operator-ffbb595cb-ghmht\" (UID: \"87f85b29-4228-4a2e-9bdd-0e08f81fbda4\") " pod="openshift-keda/keda-operator-ffbb595cb-ghmht" Apr 16 18:24:06.607228 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:06.607051 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/87f85b29-4228-4a2e-9bdd-0e08f81fbda4-cabundle0\") pod \"keda-operator-ffbb595cb-ghmht\" (UID: \"87f85b29-4228-4a2e-9bdd-0e08f81fbda4\") " pod="openshift-keda/keda-operator-ffbb595cb-ghmht" Apr 16 18:24:06.607228 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:06.607082 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brndt\" (UniqueName: \"kubernetes.io/projected/87f85b29-4228-4a2e-9bdd-0e08f81fbda4-kube-api-access-brndt\") pod \"keda-operator-ffbb595cb-ghmht\" (UID: \"87f85b29-4228-4a2e-9bdd-0e08f81fbda4\") " pod="openshift-keda/keda-operator-ffbb595cb-ghmht" Apr 16 18:24:06.707888 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:06.707854 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/projected/87f85b29-4228-4a2e-9bdd-0e08f81fbda4-certificates\") pod \"keda-operator-ffbb595cb-ghmht\" (UID: \"87f85b29-4228-4a2e-9bdd-0e08f81fbda4\") " pod="openshift-keda/keda-operator-ffbb595cb-ghmht" Apr 16 18:24:06.707888 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:06.707899 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/87f85b29-4228-4a2e-9bdd-0e08f81fbda4-cabundle0\") pod \"keda-operator-ffbb595cb-ghmht\" (UID: \"87f85b29-4228-4a2e-9bdd-0e08f81fbda4\") " pod="openshift-keda/keda-operator-ffbb595cb-ghmht" Apr 16 18:24:06.708098 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:06.707923 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-brndt\" (UniqueName: \"kubernetes.io/projected/87f85b29-4228-4a2e-9bdd-0e08f81fbda4-kube-api-access-brndt\") pod \"keda-operator-ffbb595cb-ghmht\" (UID: \"87f85b29-4228-4a2e-9bdd-0e08f81fbda4\") " pod="openshift-keda/keda-operator-ffbb595cb-ghmht" Apr 16 18:24:06.708098 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:24:06.708002 2573 secret.go:281] references non-existent secret key: ca.crt Apr 16 18:24:06.708098 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:24:06.708021 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 18:24:06.708098 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:24:06.708029 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-ghmht: references non-existent secret key: ca.crt Apr 16 18:24:06.708098 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:24:06.708082 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/87f85b29-4228-4a2e-9bdd-0e08f81fbda4-certificates podName:87f85b29-4228-4a2e-9bdd-0e08f81fbda4 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:24:07.208065523 +0000 UTC m=+448.715330617 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/87f85b29-4228-4a2e-9bdd-0e08f81fbda4-certificates") pod "keda-operator-ffbb595cb-ghmht" (UID: "87f85b29-4228-4a2e-9bdd-0e08f81fbda4") : references non-existent secret key: ca.crt Apr 16 18:24:06.708582 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:06.708565 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/87f85b29-4228-4a2e-9bdd-0e08f81fbda4-cabundle0\") pod \"keda-operator-ffbb595cb-ghmht\" (UID: \"87f85b29-4228-4a2e-9bdd-0e08f81fbda4\") " pod="openshift-keda/keda-operator-ffbb595cb-ghmht" Apr 16 18:24:06.729957 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:06.729897 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-brndt\" (UniqueName: \"kubernetes.io/projected/87f85b29-4228-4a2e-9bdd-0e08f81fbda4-kube-api-access-brndt\") pod \"keda-operator-ffbb595cb-ghmht\" (UID: \"87f85b29-4228-4a2e-9bdd-0e08f81fbda4\") " pod="openshift-keda/keda-operator-ffbb595cb-ghmht" Apr 16 18:24:06.809122 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:06.809083 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-ghs5t"] Apr 16 18:24:06.831473 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:06.831443 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-ghs5t"] Apr 16 18:24:06.831630 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:06.831602 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ghs5t" Apr 16 18:24:06.834790 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:06.834768 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 16 18:24:07.011351 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:07.011259 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hdtz\" (UniqueName: \"kubernetes.io/projected/7486bc14-c6af-4f6f-b812-931b91003996-kube-api-access-8hdtz\") pod \"keda-metrics-apiserver-7c9f485588-ghs5t\" (UID: \"7486bc14-c6af-4f6f-b812-931b91003996\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ghs5t" Apr 16 18:24:07.011351 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:07.011329 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7486bc14-c6af-4f6f-b812-931b91003996-certificates\") pod \"keda-metrics-apiserver-7c9f485588-ghs5t\" (UID: \"7486bc14-c6af-4f6f-b812-931b91003996\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ghs5t" Apr 16 18:24:07.011585 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:07.011384 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/7486bc14-c6af-4f6f-b812-931b91003996-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-ghs5t\" (UID: \"7486bc14-c6af-4f6f-b812-931b91003996\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ghs5t" Apr 16 18:24:07.064261 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:07.064227 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-s5r4b"] Apr 16 18:24:07.084960 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:07.084927 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-keda/keda-admission-cf49989db-s5r4b"] Apr 16 18:24:07.085109 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:07.085056 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-s5r4b" Apr 16 18:24:07.088102 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:07.088081 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 16 18:24:07.112100 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:07.112069 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/7486bc14-c6af-4f6f-b812-931b91003996-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-ghs5t\" (UID: \"7486bc14-c6af-4f6f-b812-931b91003996\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ghs5t" Apr 16 18:24:07.112243 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:07.112151 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8hdtz\" (UniqueName: \"kubernetes.io/projected/7486bc14-c6af-4f6f-b812-931b91003996-kube-api-access-8hdtz\") pod \"keda-metrics-apiserver-7c9f485588-ghs5t\" (UID: \"7486bc14-c6af-4f6f-b812-931b91003996\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ghs5t" Apr 16 18:24:07.112243 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:07.112205 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7486bc14-c6af-4f6f-b812-931b91003996-certificates\") pod \"keda-metrics-apiserver-7c9f485588-ghs5t\" (UID: \"7486bc14-c6af-4f6f-b812-931b91003996\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ghs5t" Apr 16 18:24:07.112362 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:24:07.112313 2573 secret.go:281] references non-existent secret key: tls.crt Apr 16 18:24:07.112362 ip-10-0-131-203 kubenswrapper[2573]: E0416 
18:24:07.112328 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 18:24:07.112362 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:24:07.112345 2573 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found Apr 16 18:24:07.112362 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:24:07.112364 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-ghs5t: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 16 18:24:07.112540 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:24:07.112427 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7486bc14-c6af-4f6f-b812-931b91003996-certificates podName:7486bc14-c6af-4f6f-b812-931b91003996 nodeName:}" failed. No retries permitted until 2026-04-16 18:24:07.612410197 +0000 UTC m=+449.119675304 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/7486bc14-c6af-4f6f-b812-931b91003996-certificates") pod "keda-metrics-apiserver-7c9f485588-ghs5t" (UID: "7486bc14-c6af-4f6f-b812-931b91003996") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 16 18:24:07.112540 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:07.112470 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/7486bc14-c6af-4f6f-b812-931b91003996-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-ghs5t\" (UID: \"7486bc14-c6af-4f6f-b812-931b91003996\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ghs5t" Apr 16 18:24:07.123421 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:07.123391 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hdtz\" (UniqueName: \"kubernetes.io/projected/7486bc14-c6af-4f6f-b812-931b91003996-kube-api-access-8hdtz\") pod \"keda-metrics-apiserver-7c9f485588-ghs5t\" (UID: \"7486bc14-c6af-4f6f-b812-931b91003996\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ghs5t" Apr 16 18:24:07.213768 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:07.213721 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/203e20a7-ed68-4784-b5f3-87a90e0abf0c-certificates\") pod \"keda-admission-cf49989db-s5r4b\" (UID: \"203e20a7-ed68-4784-b5f3-87a90e0abf0c\") " pod="openshift-keda/keda-admission-cf49989db-s5r4b" Apr 16 18:24:07.213946 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:07.213787 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/87f85b29-4228-4a2e-9bdd-0e08f81fbda4-certificates\") pod \"keda-operator-ffbb595cb-ghmht\" (UID: \"87f85b29-4228-4a2e-9bdd-0e08f81fbda4\") " 
pod="openshift-keda/keda-operator-ffbb595cb-ghmht" Apr 16 18:24:07.213946 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:07.213844 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sjk7\" (UniqueName: \"kubernetes.io/projected/203e20a7-ed68-4784-b5f3-87a90e0abf0c-kube-api-access-5sjk7\") pod \"keda-admission-cf49989db-s5r4b\" (UID: \"203e20a7-ed68-4784-b5f3-87a90e0abf0c\") " pod="openshift-keda/keda-admission-cf49989db-s5r4b" Apr 16 18:24:07.214061 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:24:07.213990 2573 secret.go:281] references non-existent secret key: ca.crt Apr 16 18:24:07.214061 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:24:07.214005 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 18:24:07.214061 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:24:07.214016 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-ghmht: references non-existent secret key: ca.crt Apr 16 18:24:07.214190 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:24:07.214067 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/87f85b29-4228-4a2e-9bdd-0e08f81fbda4-certificates podName:87f85b29-4228-4a2e-9bdd-0e08f81fbda4 nodeName:}" failed. No retries permitted until 2026-04-16 18:24:08.214047904 +0000 UTC m=+449.721313002 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/87f85b29-4228-4a2e-9bdd-0e08f81fbda4-certificates") pod "keda-operator-ffbb595cb-ghmht" (UID: "87f85b29-4228-4a2e-9bdd-0e08f81fbda4") : references non-existent secret key: ca.crt Apr 16 18:24:07.315238 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:07.315151 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5sjk7\" (UniqueName: \"kubernetes.io/projected/203e20a7-ed68-4784-b5f3-87a90e0abf0c-kube-api-access-5sjk7\") pod \"keda-admission-cf49989db-s5r4b\" (UID: \"203e20a7-ed68-4784-b5f3-87a90e0abf0c\") " pod="openshift-keda/keda-admission-cf49989db-s5r4b" Apr 16 18:24:07.315394 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:07.315246 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/203e20a7-ed68-4784-b5f3-87a90e0abf0c-certificates\") pod \"keda-admission-cf49989db-s5r4b\" (UID: \"203e20a7-ed68-4784-b5f3-87a90e0abf0c\") " pod="openshift-keda/keda-admission-cf49989db-s5r4b" Apr 16 18:24:07.317752 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:07.317728 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/203e20a7-ed68-4784-b5f3-87a90e0abf0c-certificates\") pod \"keda-admission-cf49989db-s5r4b\" (UID: \"203e20a7-ed68-4784-b5f3-87a90e0abf0c\") " pod="openshift-keda/keda-admission-cf49989db-s5r4b" Apr 16 18:24:07.323075 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:07.323050 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sjk7\" (UniqueName: \"kubernetes.io/projected/203e20a7-ed68-4784-b5f3-87a90e0abf0c-kube-api-access-5sjk7\") pod \"keda-admission-cf49989db-s5r4b\" (UID: \"203e20a7-ed68-4784-b5f3-87a90e0abf0c\") " pod="openshift-keda/keda-admission-cf49989db-s5r4b" Apr 16 18:24:07.394956 ip-10-0-131-203 
kubenswrapper[2573]: I0416 18:24:07.394923 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-s5r4b" Apr 16 18:24:07.533896 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:07.533846 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-s5r4b"] Apr 16 18:24:07.536096 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:24:07.536069 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod203e20a7_ed68_4784_b5f3_87a90e0abf0c.slice/crio-755f4c223ac7e2c467ec58132605a1358e2fdd230a4e78f96a2812d4ab7f8ad0 WatchSource:0}: Error finding container 755f4c223ac7e2c467ec58132605a1358e2fdd230a4e78f96a2812d4ab7f8ad0: Status 404 returned error can't find the container with id 755f4c223ac7e2c467ec58132605a1358e2fdd230a4e78f96a2812d4ab7f8ad0 Apr 16 18:24:07.617858 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:07.617750 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7486bc14-c6af-4f6f-b812-931b91003996-certificates\") pod \"keda-metrics-apiserver-7c9f485588-ghs5t\" (UID: \"7486bc14-c6af-4f6f-b812-931b91003996\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ghs5t" Apr 16 18:24:07.618036 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:24:07.618013 2573 secret.go:281] references non-existent secret key: tls.crt Apr 16 18:24:07.618036 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:24:07.618036 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 18:24:07.618182 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:24:07.618057 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-ghs5t: references non-existent secret key: tls.crt Apr 16 18:24:07.618182 
ip-10-0-131-203 kubenswrapper[2573]: E0416 18:24:07.618110 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7486bc14-c6af-4f6f-b812-931b91003996-certificates podName:7486bc14-c6af-4f6f-b812-931b91003996 nodeName:}" failed. No retries permitted until 2026-04-16 18:24:08.618093617 +0000 UTC m=+450.125358724 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/7486bc14-c6af-4f6f-b812-931b91003996-certificates") pod "keda-metrics-apiserver-7c9f485588-ghs5t" (UID: "7486bc14-c6af-4f6f-b812-931b91003996") : references non-existent secret key: tls.crt Apr 16 18:24:08.223316 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:08.223278 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/87f85b29-4228-4a2e-9bdd-0e08f81fbda4-certificates\") pod \"keda-operator-ffbb595cb-ghmht\" (UID: \"87f85b29-4228-4a2e-9bdd-0e08f81fbda4\") " pod="openshift-keda/keda-operator-ffbb595cb-ghmht" Apr 16 18:24:08.223490 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:24:08.223434 2573 secret.go:281] references non-existent secret key: ca.crt Apr 16 18:24:08.223490 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:24:08.223453 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 18:24:08.223490 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:24:08.223465 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-ghmht: references non-existent secret key: ca.crt Apr 16 18:24:08.223699 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:24:08.223523 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/87f85b29-4228-4a2e-9bdd-0e08f81fbda4-certificates podName:87f85b29-4228-4a2e-9bdd-0e08f81fbda4 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:24:10.223507503 +0000 UTC m=+451.730772617 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/87f85b29-4228-4a2e-9bdd-0e08f81fbda4-certificates") pod "keda-operator-ffbb595cb-ghmht" (UID: "87f85b29-4228-4a2e-9bdd-0e08f81fbda4") : references non-existent secret key: ca.crt Apr 16 18:24:08.485569 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:08.485479 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-s5r4b" event={"ID":"203e20a7-ed68-4784-b5f3-87a90e0abf0c","Type":"ContainerStarted","Data":"755f4c223ac7e2c467ec58132605a1358e2fdd230a4e78f96a2812d4ab7f8ad0"} Apr 16 18:24:08.626656 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:08.626618 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7486bc14-c6af-4f6f-b812-931b91003996-certificates\") pod \"keda-metrics-apiserver-7c9f485588-ghs5t\" (UID: \"7486bc14-c6af-4f6f-b812-931b91003996\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ghs5t" Apr 16 18:24:08.627046 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:24:08.626767 2573 secret.go:281] references non-existent secret key: tls.crt Apr 16 18:24:08.627046 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:24:08.626785 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 18:24:08.627046 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:24:08.626806 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-ghs5t: references non-existent secret key: tls.crt Apr 16 18:24:08.627046 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:24:08.626870 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7486bc14-c6af-4f6f-b812-931b91003996-certificates 
podName:7486bc14-c6af-4f6f-b812-931b91003996 nodeName:}" failed. No retries permitted until 2026-04-16 18:24:10.626850729 +0000 UTC m=+452.134115826 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/7486bc14-c6af-4f6f-b812-931b91003996-certificates") pod "keda-metrics-apiserver-7c9f485588-ghs5t" (UID: "7486bc14-c6af-4f6f-b812-931b91003996") : references non-existent secret key: tls.crt Apr 16 18:24:09.489610 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:09.489579 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-s5r4b" event={"ID":"203e20a7-ed68-4784-b5f3-87a90e0abf0c","Type":"ContainerStarted","Data":"a03b091904b15c1f3900e2d9a038e44c4ae163bac0b8d42be3d59fd48f133a96"} Apr 16 18:24:09.489795 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:09.489704 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-s5r4b" Apr 16 18:24:09.513783 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:09.513737 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-s5r4b" podStartSLOduration=1.253708353 podStartE2EDuration="2.513723509s" podCreationTimestamp="2026-04-16 18:24:07 +0000 UTC" firstStartedPulling="2026-04-16 18:24:07.537238272 +0000 UTC m=+449.044503367" lastFinishedPulling="2026-04-16 18:24:08.797253428 +0000 UTC m=+450.304518523" observedRunningTime="2026-04-16 18:24:09.512317279 +0000 UTC m=+451.019582408" watchObservedRunningTime="2026-04-16 18:24:09.513723509 +0000 UTC m=+451.020988623" Apr 16 18:24:10.241307 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:10.241258 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/87f85b29-4228-4a2e-9bdd-0e08f81fbda4-certificates\") pod \"keda-operator-ffbb595cb-ghmht\" (UID: 
\"87f85b29-4228-4a2e-9bdd-0e08f81fbda4\") " pod="openshift-keda/keda-operator-ffbb595cb-ghmht" Apr 16 18:24:10.241701 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:24:10.241388 2573 secret.go:281] references non-existent secret key: ca.crt Apr 16 18:24:10.241701 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:24:10.241400 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 18:24:10.241701 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:24:10.241408 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-ghmht: references non-existent secret key: ca.crt Apr 16 18:24:10.241701 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:24:10.241453 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/87f85b29-4228-4a2e-9bdd-0e08f81fbda4-certificates podName:87f85b29-4228-4a2e-9bdd-0e08f81fbda4 nodeName:}" failed. No retries permitted until 2026-04-16 18:24:14.241441809 +0000 UTC m=+455.748706903 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/87f85b29-4228-4a2e-9bdd-0e08f81fbda4-certificates") pod "keda-operator-ffbb595cb-ghmht" (UID: "87f85b29-4228-4a2e-9bdd-0e08f81fbda4") : references non-existent secret key: ca.crt Apr 16 18:24:10.645140 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:10.645112 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7486bc14-c6af-4f6f-b812-931b91003996-certificates\") pod \"keda-metrics-apiserver-7c9f485588-ghs5t\" (UID: \"7486bc14-c6af-4f6f-b812-931b91003996\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ghs5t" Apr 16 18:24:10.645309 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:24:10.645294 2573 secret.go:281] references non-existent secret key: tls.crt Apr 16 18:24:10.645384 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:24:10.645313 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 18:24:10.645384 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:24:10.645344 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-ghs5t: references non-existent secret key: tls.crt Apr 16 18:24:10.645474 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:24:10.645418 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7486bc14-c6af-4f6f-b812-931b91003996-certificates podName:7486bc14-c6af-4f6f-b812-931b91003996 nodeName:}" failed. No retries permitted until 2026-04-16 18:24:14.645399159 +0000 UTC m=+456.152664255 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/7486bc14-c6af-4f6f-b812-931b91003996-certificates") pod "keda-metrics-apiserver-7c9f485588-ghs5t" (UID: "7486bc14-c6af-4f6f-b812-931b91003996") : references non-existent secret key: tls.crt Apr 16 18:24:14.275965 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:14.275925 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/87f85b29-4228-4a2e-9bdd-0e08f81fbda4-certificates\") pod \"keda-operator-ffbb595cb-ghmht\" (UID: \"87f85b29-4228-4a2e-9bdd-0e08f81fbda4\") " pod="openshift-keda/keda-operator-ffbb595cb-ghmht" Apr 16 18:24:14.278266 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:14.278244 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/87f85b29-4228-4a2e-9bdd-0e08f81fbda4-certificates\") pod \"keda-operator-ffbb595cb-ghmht\" (UID: \"87f85b29-4228-4a2e-9bdd-0e08f81fbda4\") " pod="openshift-keda/keda-operator-ffbb595cb-ghmht" Apr 16 18:24:14.311824 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:14.311788 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-ghmht" Apr 16 18:24:14.429071 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:14.429047 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-ghmht"] Apr 16 18:24:14.431579 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:24:14.431535 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87f85b29_4228_4a2e_9bdd_0e08f81fbda4.slice/crio-d50a3745bd3365fca1beda8f74e472b95feb34005e6a19e8ad2672963da04a83 WatchSource:0}: Error finding container d50a3745bd3365fca1beda8f74e472b95feb34005e6a19e8ad2672963da04a83: Status 404 returned error can't find the container with id d50a3745bd3365fca1beda8f74e472b95feb34005e6a19e8ad2672963da04a83 Apr 16 18:24:14.508131 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:14.508093 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-ghmht" event={"ID":"87f85b29-4228-4a2e-9bdd-0e08f81fbda4","Type":"ContainerStarted","Data":"d50a3745bd3365fca1beda8f74e472b95feb34005e6a19e8ad2672963da04a83"} Apr 16 18:24:14.679856 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:14.679818 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7486bc14-c6af-4f6f-b812-931b91003996-certificates\") pod \"keda-metrics-apiserver-7c9f485588-ghs5t\" (UID: \"7486bc14-c6af-4f6f-b812-931b91003996\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ghs5t" Apr 16 18:24:14.682263 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:14.682240 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7486bc14-c6af-4f6f-b812-931b91003996-certificates\") pod \"keda-metrics-apiserver-7c9f485588-ghs5t\" (UID: \"7486bc14-c6af-4f6f-b812-931b91003996\") " 
pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ghs5t" Apr 16 18:24:14.941930 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:14.941892 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ghs5t" Apr 16 18:24:15.059858 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:15.059831 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-ghs5t"] Apr 16 18:24:15.061552 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:24:15.061512 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7486bc14_c6af_4f6f_b812_931b91003996.slice/crio-1aba40c683dac5e486894eafd651df3de27e99ee6ad139efdf98b44b3c20c165 WatchSource:0}: Error finding container 1aba40c683dac5e486894eafd651df3de27e99ee6ad139efdf98b44b3c20c165: Status 404 returned error can't find the container with id 1aba40c683dac5e486894eafd651df3de27e99ee6ad139efdf98b44b3c20c165 Apr 16 18:24:15.511835 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:15.511801 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ghs5t" event={"ID":"7486bc14-c6af-4f6f-b812-931b91003996","Type":"ContainerStarted","Data":"1aba40c683dac5e486894eafd651df3de27e99ee6ad139efdf98b44b3c20c165"} Apr 16 18:24:17.520404 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:17.520366 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ghs5t" event={"ID":"7486bc14-c6af-4f6f-b812-931b91003996","Type":"ContainerStarted","Data":"24206f20b8eb66138d13cf7b847be0db879e8aafc38a17961337f3f3dd284e13"} Apr 16 18:24:17.520842 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:17.520574 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ghs5t" Apr 16 18:24:17.551824 ip-10-0-131-203 
kubenswrapper[2573]: I0416 18:24:17.551779 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ghs5t" podStartSLOduration=9.286083688 podStartE2EDuration="11.551764158s" podCreationTimestamp="2026-04-16 18:24:06 +0000 UTC" firstStartedPulling="2026-04-16 18:24:15.062892208 +0000 UTC m=+456.570157306" lastFinishedPulling="2026-04-16 18:24:17.328572678 +0000 UTC m=+458.835837776" observedRunningTime="2026-04-16 18:24:17.549708035 +0000 UTC m=+459.056973151" watchObservedRunningTime="2026-04-16 18:24:17.551764158 +0000 UTC m=+459.059029307" Apr 16 18:24:27.483527 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:27.483421 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hlqrx" Apr 16 18:24:28.531377 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:28.531346 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ghs5t" Apr 16 18:24:29.567927 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:29.567895 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-ghmht" event={"ID":"87f85b29-4228-4a2e-9bdd-0e08f81fbda4","Type":"ContainerStarted","Data":"bcade117ddc1f55788592c9cf7ec4da56079733b9b1495fe3f687d076a9061a8"} Apr 16 18:24:29.568298 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:29.568061 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-ghmht" Apr 16 18:24:29.586480 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:29.586436 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-ghmht" podStartSLOduration=9.084674604 podStartE2EDuration="23.586425087s" podCreationTimestamp="2026-04-16 18:24:06 +0000 UTC" firstStartedPulling="2026-04-16 18:24:14.432846553 +0000 
UTC m=+455.940111648" lastFinishedPulling="2026-04-16 18:24:28.934597032 +0000 UTC m=+470.441862131" observedRunningTime="2026-04-16 18:24:29.584216397 +0000 UTC m=+471.091481514" watchObservedRunningTime="2026-04-16 18:24:29.586425087 +0000 UTC m=+471.093690202" Apr 16 18:24:30.497323 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:30.497291 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-s5r4b" Apr 16 18:24:50.573973 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:24:50.573941 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-ghmht" Apr 16 18:25:17.999349 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:25:17.999309 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-x742d"] Apr 16 18:25:18.006989 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:25:18.006962 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-x742d" Apr 16 18:25:18.009706 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:25:18.009673 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 18:25:18.010795 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:25:18.010772 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-x6l9j\"" Apr 16 18:25:18.011221 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:25:18.011206 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 16 18:25:18.012027 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:25:18.012008 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 18:25:18.013776 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:25:18.013747 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-x742d"] Apr 16 18:25:18.198382 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:25:18.198343 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fs7n\" (UniqueName: \"kubernetes.io/projected/ac39fd98-705a-44a6-8aa8-16a419e83ada-kube-api-access-7fs7n\") pod \"llmisvc-controller-manager-68cc5db7c4-x742d\" (UID: \"ac39fd98-705a-44a6-8aa8-16a419e83ada\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-x742d" Apr 16 18:25:18.198596 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:25:18.198401 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ac39fd98-705a-44a6-8aa8-16a419e83ada-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-x742d\" (UID: \"ac39fd98-705a-44a6-8aa8-16a419e83ada\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-x742d" 
Apr 16 18:25:18.299761 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:25:18.299678 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fs7n\" (UniqueName: \"kubernetes.io/projected/ac39fd98-705a-44a6-8aa8-16a419e83ada-kube-api-access-7fs7n\") pod \"llmisvc-controller-manager-68cc5db7c4-x742d\" (UID: \"ac39fd98-705a-44a6-8aa8-16a419e83ada\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-x742d" Apr 16 18:25:18.299761 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:25:18.299727 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ac39fd98-705a-44a6-8aa8-16a419e83ada-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-x742d\" (UID: \"ac39fd98-705a-44a6-8aa8-16a419e83ada\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-x742d" Apr 16 18:25:18.299953 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:25:18.299897 2573 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found Apr 16 18:25:18.299987 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:25:18.299962 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac39fd98-705a-44a6-8aa8-16a419e83ada-cert podName:ac39fd98-705a-44a6-8aa8-16a419e83ada nodeName:}" failed. No retries permitted until 2026-04-16 18:25:18.799944417 +0000 UTC m=+520.307209512 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ac39fd98-705a-44a6-8aa8-16a419e83ada-cert") pod "llmisvc-controller-manager-68cc5db7c4-x742d" (UID: "ac39fd98-705a-44a6-8aa8-16a419e83ada") : secret "llmisvc-webhook-server-cert" not found Apr 16 18:25:18.310463 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:25:18.310430 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fs7n\" (UniqueName: \"kubernetes.io/projected/ac39fd98-705a-44a6-8aa8-16a419e83ada-kube-api-access-7fs7n\") pod \"llmisvc-controller-manager-68cc5db7c4-x742d\" (UID: \"ac39fd98-705a-44a6-8aa8-16a419e83ada\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-x742d" Apr 16 18:25:18.805153 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:25:18.805118 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ac39fd98-705a-44a6-8aa8-16a419e83ada-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-x742d\" (UID: \"ac39fd98-705a-44a6-8aa8-16a419e83ada\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-x742d" Apr 16 18:25:18.807637 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:25:18.807607 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ac39fd98-705a-44a6-8aa8-16a419e83ada-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-x742d\" (UID: \"ac39fd98-705a-44a6-8aa8-16a419e83ada\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-x742d" Apr 16 18:25:18.919851 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:25:18.919812 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-x742d" Apr 16 18:25:19.043154 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:25:19.043127 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-x742d"] Apr 16 18:25:19.045788 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:25:19.045753 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podac39fd98_705a_44a6_8aa8_16a419e83ada.slice/crio-701d8dbdbb08ca3b6408644cc7e2bb8708fe51807cec007d5aa7a280aadaee13 WatchSource:0}: Error finding container 701d8dbdbb08ca3b6408644cc7e2bb8708fe51807cec007d5aa7a280aadaee13: Status 404 returned error can't find the container with id 701d8dbdbb08ca3b6408644cc7e2bb8708fe51807cec007d5aa7a280aadaee13 Apr 16 18:25:19.727256 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:25:19.727217 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-x742d" event={"ID":"ac39fd98-705a-44a6-8aa8-16a419e83ada","Type":"ContainerStarted","Data":"701d8dbdbb08ca3b6408644cc7e2bb8708fe51807cec007d5aa7a280aadaee13"} Apr 16 18:25:21.735197 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:25:21.735163 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-x742d" event={"ID":"ac39fd98-705a-44a6-8aa8-16a419e83ada","Type":"ContainerStarted","Data":"35bcf85290bcebae8e9a1ec081b65d7e646e711194b5577b548f96e7ed9f1321"} Apr 16 18:25:21.735598 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:25:21.735386 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-x742d" Apr 16 18:25:21.753251 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:25:21.753200 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-x742d" podStartSLOduration=2.807917584 podStartE2EDuration="4.753182445s" 
podCreationTimestamp="2026-04-16 18:25:17 +0000 UTC" firstStartedPulling="2026-04-16 18:25:19.046648074 +0000 UTC m=+520.553913168" lastFinishedPulling="2026-04-16 18:25:20.991912923 +0000 UTC m=+522.499178029" observedRunningTime="2026-04-16 18:25:21.751423037 +0000 UTC m=+523.258688152" watchObservedRunningTime="2026-04-16 18:25:21.753182445 +0000 UTC m=+523.260447560" Apr 16 18:25:52.740792 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:25:52.740759 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-x742d" Apr 16 18:26:22.811880 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:22.811844 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-8f89b9d4c-5svtt"] Apr 16 18:26:22.818156 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:22.818132 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8f89b9d4c-5svtt" Apr 16 18:26:22.825052 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:22.824716 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8f89b9d4c-5svtt"] Apr 16 18:26:22.844200 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:22.844169 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bb1d30f8-6006-4ff5-9f54-8076e4677c3a-console-serving-cert\") pod \"console-8f89b9d4c-5svtt\" (UID: \"bb1d30f8-6006-4ff5-9f54-8076e4677c3a\") " pod="openshift-console/console-8f89b9d4c-5svtt" Apr 16 18:26:22.844200 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:22.844204 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fjvs\" (UniqueName: \"kubernetes.io/projected/bb1d30f8-6006-4ff5-9f54-8076e4677c3a-kube-api-access-9fjvs\") pod \"console-8f89b9d4c-5svtt\" (UID: \"bb1d30f8-6006-4ff5-9f54-8076e4677c3a\") " 
pod="openshift-console/console-8f89b9d4c-5svtt" Apr 16 18:26:22.844439 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:22.844263 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bb1d30f8-6006-4ff5-9f54-8076e4677c3a-oauth-serving-cert\") pod \"console-8f89b9d4c-5svtt\" (UID: \"bb1d30f8-6006-4ff5-9f54-8076e4677c3a\") " pod="openshift-console/console-8f89b9d4c-5svtt" Apr 16 18:26:22.844439 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:22.844322 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bb1d30f8-6006-4ff5-9f54-8076e4677c3a-console-config\") pod \"console-8f89b9d4c-5svtt\" (UID: \"bb1d30f8-6006-4ff5-9f54-8076e4677c3a\") " pod="openshift-console/console-8f89b9d4c-5svtt" Apr 16 18:26:22.844439 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:22.844410 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb1d30f8-6006-4ff5-9f54-8076e4677c3a-trusted-ca-bundle\") pod \"console-8f89b9d4c-5svtt\" (UID: \"bb1d30f8-6006-4ff5-9f54-8076e4677c3a\") " pod="openshift-console/console-8f89b9d4c-5svtt" Apr 16 18:26:22.844534 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:22.844441 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bb1d30f8-6006-4ff5-9f54-8076e4677c3a-service-ca\") pod \"console-8f89b9d4c-5svtt\" (UID: \"bb1d30f8-6006-4ff5-9f54-8076e4677c3a\") " pod="openshift-console/console-8f89b9d4c-5svtt" Apr 16 18:26:22.844534 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:22.844463 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/bb1d30f8-6006-4ff5-9f54-8076e4677c3a-console-oauth-config\") pod \"console-8f89b9d4c-5svtt\" (UID: \"bb1d30f8-6006-4ff5-9f54-8076e4677c3a\") " pod="openshift-console/console-8f89b9d4c-5svtt" Apr 16 18:26:22.945744 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:22.945694 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb1d30f8-6006-4ff5-9f54-8076e4677c3a-trusted-ca-bundle\") pod \"console-8f89b9d4c-5svtt\" (UID: \"bb1d30f8-6006-4ff5-9f54-8076e4677c3a\") " pod="openshift-console/console-8f89b9d4c-5svtt" Apr 16 18:26:22.945744 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:22.945744 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bb1d30f8-6006-4ff5-9f54-8076e4677c3a-service-ca\") pod \"console-8f89b9d4c-5svtt\" (UID: \"bb1d30f8-6006-4ff5-9f54-8076e4677c3a\") " pod="openshift-console/console-8f89b9d4c-5svtt" Apr 16 18:26:22.945996 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:22.945772 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bb1d30f8-6006-4ff5-9f54-8076e4677c3a-console-oauth-config\") pod \"console-8f89b9d4c-5svtt\" (UID: \"bb1d30f8-6006-4ff5-9f54-8076e4677c3a\") " pod="openshift-console/console-8f89b9d4c-5svtt" Apr 16 18:26:22.945996 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:22.945791 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bb1d30f8-6006-4ff5-9f54-8076e4677c3a-console-serving-cert\") pod \"console-8f89b9d4c-5svtt\" (UID: \"bb1d30f8-6006-4ff5-9f54-8076e4677c3a\") " pod="openshift-console/console-8f89b9d4c-5svtt" Apr 16 18:26:22.945996 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:22.945807 2573 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-9fjvs\" (UniqueName: \"kubernetes.io/projected/bb1d30f8-6006-4ff5-9f54-8076e4677c3a-kube-api-access-9fjvs\") pod \"console-8f89b9d4c-5svtt\" (UID: \"bb1d30f8-6006-4ff5-9f54-8076e4677c3a\") " pod="openshift-console/console-8f89b9d4c-5svtt" Apr 16 18:26:22.945996 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:22.945845 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bb1d30f8-6006-4ff5-9f54-8076e4677c3a-oauth-serving-cert\") pod \"console-8f89b9d4c-5svtt\" (UID: \"bb1d30f8-6006-4ff5-9f54-8076e4677c3a\") " pod="openshift-console/console-8f89b9d4c-5svtt" Apr 16 18:26:22.945996 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:22.945917 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bb1d30f8-6006-4ff5-9f54-8076e4677c3a-console-config\") pod \"console-8f89b9d4c-5svtt\" (UID: \"bb1d30f8-6006-4ff5-9f54-8076e4677c3a\") " pod="openshift-console/console-8f89b9d4c-5svtt" Apr 16 18:26:22.946614 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:22.946585 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bb1d30f8-6006-4ff5-9f54-8076e4677c3a-service-ca\") pod \"console-8f89b9d4c-5svtt\" (UID: \"bb1d30f8-6006-4ff5-9f54-8076e4677c3a\") " pod="openshift-console/console-8f89b9d4c-5svtt" Apr 16 18:26:22.946614 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:22.946604 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb1d30f8-6006-4ff5-9f54-8076e4677c3a-trusted-ca-bundle\") pod \"console-8f89b9d4c-5svtt\" (UID: \"bb1d30f8-6006-4ff5-9f54-8076e4677c3a\") " pod="openshift-console/console-8f89b9d4c-5svtt" Apr 16 18:26:22.946783 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:22.946642 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bb1d30f8-6006-4ff5-9f54-8076e4677c3a-oauth-serving-cert\") pod \"console-8f89b9d4c-5svtt\" (UID: \"bb1d30f8-6006-4ff5-9f54-8076e4677c3a\") " pod="openshift-console/console-8f89b9d4c-5svtt" Apr 16 18:26:22.946783 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:22.946653 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bb1d30f8-6006-4ff5-9f54-8076e4677c3a-console-config\") pod \"console-8f89b9d4c-5svtt\" (UID: \"bb1d30f8-6006-4ff5-9f54-8076e4677c3a\") " pod="openshift-console/console-8f89b9d4c-5svtt" Apr 16 18:26:22.948492 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:22.948468 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bb1d30f8-6006-4ff5-9f54-8076e4677c3a-console-oauth-config\") pod \"console-8f89b9d4c-5svtt\" (UID: \"bb1d30f8-6006-4ff5-9f54-8076e4677c3a\") " pod="openshift-console/console-8f89b9d4c-5svtt" Apr 16 18:26:22.948628 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:22.948468 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bb1d30f8-6006-4ff5-9f54-8076e4677c3a-console-serving-cert\") pod \"console-8f89b9d4c-5svtt\" (UID: \"bb1d30f8-6006-4ff5-9f54-8076e4677c3a\") " pod="openshift-console/console-8f89b9d4c-5svtt" Apr 16 18:26:22.955308 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:22.955286 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fjvs\" (UniqueName: \"kubernetes.io/projected/bb1d30f8-6006-4ff5-9f54-8076e4677c3a-kube-api-access-9fjvs\") pod \"console-8f89b9d4c-5svtt\" (UID: \"bb1d30f8-6006-4ff5-9f54-8076e4677c3a\") " pod="openshift-console/console-8f89b9d4c-5svtt" Apr 16 18:26:23.127892 ip-10-0-131-203 
kubenswrapper[2573]: I0416 18:26:23.127863 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8f89b9d4c-5svtt" Apr 16 18:26:23.252824 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:23.252799 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8f89b9d4c-5svtt"] Apr 16 18:26:23.255074 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:26:23.255050 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb1d30f8_6006_4ff5_9f54_8076e4677c3a.slice/crio-89b50e4ed000ed6d310ad6b13836c8a2014b3b255b5666d3b432adf3d3c278bf WatchSource:0}: Error finding container 89b50e4ed000ed6d310ad6b13836c8a2014b3b255b5666d3b432adf3d3c278bf: Status 404 returned error can't find the container with id 89b50e4ed000ed6d310ad6b13836c8a2014b3b255b5666d3b432adf3d3c278bf Apr 16 18:26:23.934901 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:23.934866 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8f89b9d4c-5svtt" event={"ID":"bb1d30f8-6006-4ff5-9f54-8076e4677c3a","Type":"ContainerStarted","Data":"78db2281f5726d31636a0ec3851b78da1cd3e1dc442c76f8abc0a48be75a9f53"} Apr 16 18:26:23.934901 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:23.934903 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8f89b9d4c-5svtt" event={"ID":"bb1d30f8-6006-4ff5-9f54-8076e4677c3a","Type":"ContainerStarted","Data":"89b50e4ed000ed6d310ad6b13836c8a2014b3b255b5666d3b432adf3d3c278bf"} Apr 16 18:26:23.960662 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:23.960615 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-8f89b9d4c-5svtt" podStartSLOduration=1.960601177 podStartE2EDuration="1.960601177s" podCreationTimestamp="2026-04-16 18:26:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:26:23.958616603 +0000 UTC m=+585.465881734" watchObservedRunningTime="2026-04-16 18:26:23.960601177 +0000 UTC m=+585.467866293" Apr 16 18:26:27.277155 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:27.277117 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-l7fgf"] Apr 16 18:26:27.280922 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:27.280902 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-l7fgf" Apr 16 18:26:27.283812 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:27.283792 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-p8krk\"" Apr 16 18:26:27.284720 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:27.284695 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 16 18:26:27.290358 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:27.290335 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-l7fgf"] Apr 16 18:26:27.293344 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:27.293306 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-gpppc"] Apr 16 18:26:27.296796 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:27.296777 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-gpppc" Apr 16 18:26:27.299243 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:27.299225 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 16 18:26:27.299243 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:27.299235 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-dp4sh\"" Apr 16 18:26:27.309033 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:27.309001 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-gpppc"] Apr 16 18:26:27.385433 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:27.385398 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnpxj\" (UniqueName: \"kubernetes.io/projected/e373ca16-fe02-4c6f-929d-0ea49c8b7eba-kube-api-access-jnpxj\") pod \"odh-model-controller-696fc77849-gpppc\" (UID: \"e373ca16-fe02-4c6f-929d-0ea49c8b7eba\") " pod="kserve/odh-model-controller-696fc77849-gpppc" Apr 16 18:26:27.385433 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:27.385437 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e373ca16-fe02-4c6f-929d-0ea49c8b7eba-cert\") pod \"odh-model-controller-696fc77849-gpppc\" (UID: \"e373ca16-fe02-4c6f-929d-0ea49c8b7eba\") " pod="kserve/odh-model-controller-696fc77849-gpppc" Apr 16 18:26:27.385692 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:27.385519 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jktxt\" (UniqueName: \"kubernetes.io/projected/8f59b7f1-b9af-4e1f-b44d-2e917e8d4cf9-kube-api-access-jktxt\") pod \"model-serving-api-86f7b4b499-l7fgf\" (UID: \"8f59b7f1-b9af-4e1f-b44d-2e917e8d4cf9\") " 
pod="kserve/model-serving-api-86f7b4b499-l7fgf" Apr 16 18:26:27.385692 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:27.385641 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8f59b7f1-b9af-4e1f-b44d-2e917e8d4cf9-tls-certs\") pod \"model-serving-api-86f7b4b499-l7fgf\" (UID: \"8f59b7f1-b9af-4e1f-b44d-2e917e8d4cf9\") " pod="kserve/model-serving-api-86f7b4b499-l7fgf" Apr 16 18:26:27.486452 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:27.486415 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8f59b7f1-b9af-4e1f-b44d-2e917e8d4cf9-tls-certs\") pod \"model-serving-api-86f7b4b499-l7fgf\" (UID: \"8f59b7f1-b9af-4e1f-b44d-2e917e8d4cf9\") " pod="kserve/model-serving-api-86f7b4b499-l7fgf" Apr 16 18:26:27.486646 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:27.486487 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jnpxj\" (UniqueName: \"kubernetes.io/projected/e373ca16-fe02-4c6f-929d-0ea49c8b7eba-kube-api-access-jnpxj\") pod \"odh-model-controller-696fc77849-gpppc\" (UID: \"e373ca16-fe02-4c6f-929d-0ea49c8b7eba\") " pod="kserve/odh-model-controller-696fc77849-gpppc" Apr 16 18:26:27.486646 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:27.486519 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e373ca16-fe02-4c6f-929d-0ea49c8b7eba-cert\") pod \"odh-model-controller-696fc77849-gpppc\" (UID: \"e373ca16-fe02-4c6f-929d-0ea49c8b7eba\") " pod="kserve/odh-model-controller-696fc77849-gpppc" Apr 16 18:26:27.486646 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:26:27.486556 2573 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Apr 16 18:26:27.486646 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:27.486588 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jktxt\" (UniqueName: \"kubernetes.io/projected/8f59b7f1-b9af-4e1f-b44d-2e917e8d4cf9-kube-api-access-jktxt\") pod \"model-serving-api-86f7b4b499-l7fgf\" (UID: \"8f59b7f1-b9af-4e1f-b44d-2e917e8d4cf9\") " pod="kserve/model-serving-api-86f7b4b499-l7fgf" Apr 16 18:26:27.486646 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:26:27.486628 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f59b7f1-b9af-4e1f-b44d-2e917e8d4cf9-tls-certs podName:8f59b7f1-b9af-4e1f-b44d-2e917e8d4cf9 nodeName:}" failed. No retries permitted until 2026-04-16 18:26:27.986605404 +0000 UTC m=+589.493870504 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/8f59b7f1-b9af-4e1f-b44d-2e917e8d4cf9-tls-certs") pod "model-serving-api-86f7b4b499-l7fgf" (UID: "8f59b7f1-b9af-4e1f-b44d-2e917e8d4cf9") : secret "model-serving-api-tls" not found Apr 16 18:26:27.486841 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:26:27.486798 2573 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 16 18:26:27.486879 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:26:27.486850 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e373ca16-fe02-4c6f-929d-0ea49c8b7eba-cert podName:e373ca16-fe02-4c6f-929d-0ea49c8b7eba nodeName:}" failed. No retries permitted until 2026-04-16 18:26:27.986825926 +0000 UTC m=+589.494091025 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e373ca16-fe02-4c6f-929d-0ea49c8b7eba-cert") pod "odh-model-controller-696fc77849-gpppc" (UID: "e373ca16-fe02-4c6f-929d-0ea49c8b7eba") : secret "odh-model-controller-webhook-cert" not found Apr 16 18:26:27.495412 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:27.495387 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnpxj\" (UniqueName: \"kubernetes.io/projected/e373ca16-fe02-4c6f-929d-0ea49c8b7eba-kube-api-access-jnpxj\") pod \"odh-model-controller-696fc77849-gpppc\" (UID: \"e373ca16-fe02-4c6f-929d-0ea49c8b7eba\") " pod="kserve/odh-model-controller-696fc77849-gpppc" Apr 16 18:26:27.495533 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:27.495459 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jktxt\" (UniqueName: \"kubernetes.io/projected/8f59b7f1-b9af-4e1f-b44d-2e917e8d4cf9-kube-api-access-jktxt\") pod \"model-serving-api-86f7b4b499-l7fgf\" (UID: \"8f59b7f1-b9af-4e1f-b44d-2e917e8d4cf9\") " pod="kserve/model-serving-api-86f7b4b499-l7fgf" Apr 16 18:26:27.991767 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:27.991728 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8f59b7f1-b9af-4e1f-b44d-2e917e8d4cf9-tls-certs\") pod \"model-serving-api-86f7b4b499-l7fgf\" (UID: \"8f59b7f1-b9af-4e1f-b44d-2e917e8d4cf9\") " pod="kserve/model-serving-api-86f7b4b499-l7fgf" Apr 16 18:26:27.991956 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:27.991899 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e373ca16-fe02-4c6f-929d-0ea49c8b7eba-cert\") pod \"odh-model-controller-696fc77849-gpppc\" (UID: \"e373ca16-fe02-4c6f-929d-0ea49c8b7eba\") " pod="kserve/odh-model-controller-696fc77849-gpppc" Apr 16 18:26:27.994194 ip-10-0-131-203 kubenswrapper[2573]: I0416 
18:26:27.994172 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8f59b7f1-b9af-4e1f-b44d-2e917e8d4cf9-tls-certs\") pod \"model-serving-api-86f7b4b499-l7fgf\" (UID: \"8f59b7f1-b9af-4e1f-b44d-2e917e8d4cf9\") " pod="kserve/model-serving-api-86f7b4b499-l7fgf" Apr 16 18:26:27.994304 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:27.994179 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e373ca16-fe02-4c6f-929d-0ea49c8b7eba-cert\") pod \"odh-model-controller-696fc77849-gpppc\" (UID: \"e373ca16-fe02-4c6f-929d-0ea49c8b7eba\") " pod="kserve/odh-model-controller-696fc77849-gpppc" Apr 16 18:26:28.193844 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:28.193812 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-l7fgf" Apr 16 18:26:28.210640 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:28.210607 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-gpppc" Apr 16 18:26:28.325269 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:28.325242 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-l7fgf"] Apr 16 18:26:28.344409 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:28.344382 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-gpppc"] Apr 16 18:26:28.346006 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:26:28.345982 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode373ca16_fe02_4c6f_929d_0ea49c8b7eba.slice/crio-63cf4124bcccab9a6a08e8d54d6872c88ce45854786d7e0f979e06c224b5955b WatchSource:0}: Error finding container 63cf4124bcccab9a6a08e8d54d6872c88ce45854786d7e0f979e06c224b5955b: Status 404 returned error can't find the container with id 63cf4124bcccab9a6a08e8d54d6872c88ce45854786d7e0f979e06c224b5955b Apr 16 18:26:28.955999 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:28.955932 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-l7fgf" event={"ID":"8f59b7f1-b9af-4e1f-b44d-2e917e8d4cf9","Type":"ContainerStarted","Data":"6eb59466460a90c89022d05be93760407775c312da41116e56f8da39e8a2e159"} Apr 16 18:26:28.957343 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:28.957284 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-gpppc" event={"ID":"e373ca16-fe02-4c6f-929d-0ea49c8b7eba","Type":"ContainerStarted","Data":"63cf4124bcccab9a6a08e8d54d6872c88ce45854786d7e0f979e06c224b5955b"} Apr 16 18:26:31.970616 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:31.970582 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-l7fgf" 
event={"ID":"8f59b7f1-b9af-4e1f-b44d-2e917e8d4cf9","Type":"ContainerStarted","Data":"62ca3a360e3a9b325a3347b8e3dc84a6db7f634c92e19f6b2cdd1d7c37ab328f"} Apr 16 18:26:31.971065 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:31.970686 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-l7fgf" Apr 16 18:26:31.971945 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:31.971914 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-gpppc" event={"ID":"e373ca16-fe02-4c6f-929d-0ea49c8b7eba","Type":"ContainerStarted","Data":"6b2d811260ffb94248fe715c4f8ea8011504e19e2444cf51447fe72c1dcd4f88"} Apr 16 18:26:31.972051 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:31.972016 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-gpppc" Apr 16 18:26:31.989859 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:31.989813 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-l7fgf" podStartSLOduration=1.5280088840000001 podStartE2EDuration="4.989798544s" podCreationTimestamp="2026-04-16 18:26:27 +0000 UTC" firstStartedPulling="2026-04-16 18:26:28.330204775 +0000 UTC m=+589.837469871" lastFinishedPulling="2026-04-16 18:26:31.791994434 +0000 UTC m=+593.299259531" observedRunningTime="2026-04-16 18:26:31.988771189 +0000 UTC m=+593.496036305" watchObservedRunningTime="2026-04-16 18:26:31.989798544 +0000 UTC m=+593.497063659" Apr 16 18:26:32.006877 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:32.006826 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-gpppc" podStartSLOduration=1.5576401340000001 podStartE2EDuration="5.006810881s" podCreationTimestamp="2026-04-16 18:26:27 +0000 UTC" firstStartedPulling="2026-04-16 18:26:28.347646438 +0000 UTC m=+589.854911533" 
lastFinishedPulling="2026-04-16 18:26:31.796817172 +0000 UTC m=+593.304082280" observedRunningTime="2026-04-16 18:26:32.004928592 +0000 UTC m=+593.512193720" watchObservedRunningTime="2026-04-16 18:26:32.006810881 +0000 UTC m=+593.514076053" Apr 16 18:26:33.128532 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:33.128501 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-8f89b9d4c-5svtt" Apr 16 18:26:33.128924 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:33.128593 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-8f89b9d4c-5svtt" Apr 16 18:26:33.133382 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:33.133359 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-8f89b9d4c-5svtt" Apr 16 18:26:33.983671 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:33.983640 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-8f89b9d4c-5svtt" Apr 16 18:26:34.054197 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:34.054159 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5cfcbb68d5-77gc6"] Apr 16 18:26:39.013209 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:39.013180 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-dn4qx_a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0/console-operator/2.log" Apr 16 18:26:39.013691 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:39.013351 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-dn4qx_a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0/console-operator/2.log" Apr 16 18:26:42.977950 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:42.977917 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-gpppc" Apr 16 
18:26:42.979957 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:42.979940 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-l7fgf" Apr 16 18:26:59.077859 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:59.077768 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5cfcbb68d5-77gc6" podUID="db4fd461-bc31-4928-9356-791590b02269" containerName="console" containerID="cri-o://3e65c1b0ee0f1481be032be5524a3ea5e8c7f564e9bb5e3327402b43b1db4a26" gracePeriod=15 Apr 16 18:26:59.326921 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:59.326901 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5cfcbb68d5-77gc6_db4fd461-bc31-4928-9356-791590b02269/console/0.log" Apr 16 18:26:59.327037 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:59.326960 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5cfcbb68d5-77gc6" Apr 16 18:26:59.465927 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:59.465894 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db4fd461-bc31-4928-9356-791590b02269-trusted-ca-bundle\") pod \"db4fd461-bc31-4928-9356-791590b02269\" (UID: \"db4fd461-bc31-4928-9356-791590b02269\") " Apr 16 18:26:59.466118 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:59.465945 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62hkn\" (UniqueName: \"kubernetes.io/projected/db4fd461-bc31-4928-9356-791590b02269-kube-api-access-62hkn\") pod \"db4fd461-bc31-4928-9356-791590b02269\" (UID: \"db4fd461-bc31-4928-9356-791590b02269\") " Apr 16 18:26:59.466118 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:59.465971 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/db4fd461-bc31-4928-9356-791590b02269-oauth-serving-cert\") pod \"db4fd461-bc31-4928-9356-791590b02269\" (UID: \"db4fd461-bc31-4928-9356-791590b02269\") " Apr 16 18:26:59.466118 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:59.466017 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db4fd461-bc31-4928-9356-791590b02269-console-config\") pod \"db4fd461-bc31-4928-9356-791590b02269\" (UID: \"db4fd461-bc31-4928-9356-791590b02269\") " Apr 16 18:26:59.466118 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:59.466042 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db4fd461-bc31-4928-9356-791590b02269-console-oauth-config\") pod \"db4fd461-bc31-4928-9356-791590b02269\" (UID: \"db4fd461-bc31-4928-9356-791590b02269\") " Apr 16 18:26:59.466118 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:59.466059 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db4fd461-bc31-4928-9356-791590b02269-console-serving-cert\") pod \"db4fd461-bc31-4928-9356-791590b02269\" (UID: \"db4fd461-bc31-4928-9356-791590b02269\") " Apr 16 18:26:59.466118 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:59.466102 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db4fd461-bc31-4928-9356-791590b02269-service-ca\") pod \"db4fd461-bc31-4928-9356-791590b02269\" (UID: \"db4fd461-bc31-4928-9356-791590b02269\") " Apr 16 18:26:59.466531 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:59.466500 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db4fd461-bc31-4928-9356-791590b02269-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod 
"db4fd461-bc31-4928-9356-791590b02269" (UID: "db4fd461-bc31-4928-9356-791590b02269"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:26:59.466531 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:59.466506 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db4fd461-bc31-4928-9356-791590b02269-console-config" (OuterVolumeSpecName: "console-config") pod "db4fd461-bc31-4928-9356-791590b02269" (UID: "db4fd461-bc31-4928-9356-791590b02269"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:26:59.466748 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:59.466604 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db4fd461-bc31-4928-9356-791590b02269-service-ca" (OuterVolumeSpecName: "service-ca") pod "db4fd461-bc31-4928-9356-791590b02269" (UID: "db4fd461-bc31-4928-9356-791590b02269"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:26:59.466748 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:59.466688 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db4fd461-bc31-4928-9356-791590b02269-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "db4fd461-bc31-4928-9356-791590b02269" (UID: "db4fd461-bc31-4928-9356-791590b02269"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:26:59.468149 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:59.468125 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db4fd461-bc31-4928-9356-791590b02269-kube-api-access-62hkn" (OuterVolumeSpecName: "kube-api-access-62hkn") pod "db4fd461-bc31-4928-9356-791590b02269" (UID: "db4fd461-bc31-4928-9356-791590b02269"). InnerVolumeSpecName "kube-api-access-62hkn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:26:59.468258 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:59.468234 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db4fd461-bc31-4928-9356-791590b02269-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "db4fd461-bc31-4928-9356-791590b02269" (UID: "db4fd461-bc31-4928-9356-791590b02269"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:26:59.468415 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:59.468395 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db4fd461-bc31-4928-9356-791590b02269-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "db4fd461-bc31-4928-9356-791590b02269" (UID: "db4fd461-bc31-4928-9356-791590b02269"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:26:59.566711 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:59.566674 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db4fd461-bc31-4928-9356-791590b02269-console-config\") on node \"ip-10-0-131-203.ec2.internal\" DevicePath \"\"" Apr 16 18:26:59.566711 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:59.566704 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db4fd461-bc31-4928-9356-791590b02269-console-oauth-config\") on node \"ip-10-0-131-203.ec2.internal\" DevicePath \"\"" Apr 16 18:26:59.566711 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:59.566714 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db4fd461-bc31-4928-9356-791590b02269-console-serving-cert\") on node \"ip-10-0-131-203.ec2.internal\" DevicePath \"\"" Apr 16 18:26:59.566711 ip-10-0-131-203 
kubenswrapper[2573]: I0416 18:26:59.566723 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db4fd461-bc31-4928-9356-791590b02269-service-ca\") on node \"ip-10-0-131-203.ec2.internal\" DevicePath \"\"" Apr 16 18:26:59.566983 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:59.566731 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db4fd461-bc31-4928-9356-791590b02269-trusted-ca-bundle\") on node \"ip-10-0-131-203.ec2.internal\" DevicePath \"\"" Apr 16 18:26:59.566983 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:59.566740 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-62hkn\" (UniqueName: \"kubernetes.io/projected/db4fd461-bc31-4928-9356-791590b02269-kube-api-access-62hkn\") on node \"ip-10-0-131-203.ec2.internal\" DevicePath \"\"" Apr 16 18:26:59.566983 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:26:59.566749 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db4fd461-bc31-4928-9356-791590b02269-oauth-serving-cert\") on node \"ip-10-0-131-203.ec2.internal\" DevicePath \"\"" Apr 16 18:27:00.066316 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:27:00.066287 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5cfcbb68d5-77gc6_db4fd461-bc31-4928-9356-791590b02269/console/0.log" Apr 16 18:27:00.066489 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:27:00.066328 2573 generic.go:358] "Generic (PLEG): container finished" podID="db4fd461-bc31-4928-9356-791590b02269" containerID="3e65c1b0ee0f1481be032be5524a3ea5e8c7f564e9bb5e3327402b43b1db4a26" exitCode=2 Apr 16 18:27:00.066489 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:27:00.066381 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5cfcbb68d5-77gc6" 
event={"ID":"db4fd461-bc31-4928-9356-791590b02269","Type":"ContainerDied","Data":"3e65c1b0ee0f1481be032be5524a3ea5e8c7f564e9bb5e3327402b43b1db4a26"} Apr 16 18:27:00.066489 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:27:00.066413 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5cfcbb68d5-77gc6" event={"ID":"db4fd461-bc31-4928-9356-791590b02269","Type":"ContainerDied","Data":"b8319998d65dd6b53561425d6622f347defbad627685080675c337fbd87557df"} Apr 16 18:27:00.066489 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:27:00.066431 2573 scope.go:117] "RemoveContainer" containerID="3e65c1b0ee0f1481be032be5524a3ea5e8c7f564e9bb5e3327402b43b1db4a26" Apr 16 18:27:00.066675 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:27:00.066432 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5cfcbb68d5-77gc6" Apr 16 18:27:00.075170 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:27:00.075154 2573 scope.go:117] "RemoveContainer" containerID="3e65c1b0ee0f1481be032be5524a3ea5e8c7f564e9bb5e3327402b43b1db4a26" Apr 16 18:27:00.075414 ip-10-0-131-203 kubenswrapper[2573]: E0416 18:27:00.075394 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e65c1b0ee0f1481be032be5524a3ea5e8c7f564e9bb5e3327402b43b1db4a26\": container with ID starting with 3e65c1b0ee0f1481be032be5524a3ea5e8c7f564e9bb5e3327402b43b1db4a26 not found: ID does not exist" containerID="3e65c1b0ee0f1481be032be5524a3ea5e8c7f564e9bb5e3327402b43b1db4a26" Apr 16 18:27:00.075477 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:27:00.075422 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e65c1b0ee0f1481be032be5524a3ea5e8c7f564e9bb5e3327402b43b1db4a26"} err="failed to get container status \"3e65c1b0ee0f1481be032be5524a3ea5e8c7f564e9bb5e3327402b43b1db4a26\": rpc error: code = NotFound desc = could not find container 
\"3e65c1b0ee0f1481be032be5524a3ea5e8c7f564e9bb5e3327402b43b1db4a26\": container with ID starting with 3e65c1b0ee0f1481be032be5524a3ea5e8c7f564e9bb5e3327402b43b1db4a26 not found: ID does not exist" Apr 16 18:27:00.088008 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:27:00.087979 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5cfcbb68d5-77gc6"] Apr 16 18:27:00.091859 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:27:00.091836 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5cfcbb68d5-77gc6"] Apr 16 18:27:01.121492 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:27:01.121463 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db4fd461-bc31-4928-9356-791590b02269" path="/var/lib/kubelet/pods/db4fd461-bc31-4928-9356-791590b02269/volumes" Apr 16 18:31:39.036151 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:31:39.036065 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-dn4qx_a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0/console-operator/2.log" Apr 16 18:31:39.037105 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:31:39.037086 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-dn4qx_a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0/console-operator/2.log" Apr 16 18:36:39.058397 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:36:39.058367 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-dn4qx_a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0/console-operator/2.log" Apr 16 18:36:39.065836 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:36:39.065808 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-dn4qx_a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0/console-operator/2.log" Apr 16 18:41:09.346455 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:09.346423 2573 
???:1] "http: TLS handshake error from 10.0.141.219:36666: EOF"
Apr 16 18:41:09.349277 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:09.349255 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-xnggq_7da343e6-5030-4252-8d63-0b49d5c3ff00/global-pull-secret-syncer/0.log"
Apr 16 18:41:09.399725 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:09.399700 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-c6ncp_99eb116c-fa90-4ac9-a593-ec208e5f2f43/konnectivity-agent/0.log"
Apr 16 18:41:09.510168 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:09.510146 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-203.ec2.internal_cad5fa07f782ca2be3a004d69f182f5f/haproxy/0.log"
Apr 16 18:41:13.174235 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:13.174208 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-4czjs_40926aa7-014a-4c73-95f1-c882be5b82a4/cluster-monitoring-operator/0.log"
Apr 16 18:41:13.338946 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:13.338919 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8bbb7_85fa889e-121e-49ab-b5e9-49f2f731ad8b/node-exporter/0.log"
Apr 16 18:41:13.359149 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:13.359123 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8bbb7_85fa889e-121e-49ab-b5e9-49f2f731ad8b/kube-rbac-proxy/0.log"
Apr 16 18:41:13.378966 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:13.378938 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8bbb7_85fa889e-121e-49ab-b5e9-49f2f731ad8b/init-textfile/0.log"
Apr 16 18:41:13.543412 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:13.543329 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-5mxgb_bd4a3eb9-d8d8-4957-b1e4-87fc2e796d99/kube-rbac-proxy-main/0.log"
Apr 16 18:41:13.575495 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:13.575470 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-5mxgb_bd4a3eb9-d8d8-4957-b1e4-87fc2e796d99/kube-rbac-proxy-self/0.log"
Apr 16 18:41:13.600361 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:13.600339 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-5mxgb_bd4a3eb9-d8d8-4957-b1e4-87fc2e796d99/openshift-state-metrics/0.log"
Apr 16 18:41:13.798780 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:13.798713 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-d7mmq_bc144f01-96a2-4b5e-bd4c-164a324f11de/prometheus-operator/0.log"
Apr 16 18:41:13.819229 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:13.819207 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-d7mmq_bc144f01-96a2-4b5e-bd4c-164a324f11de/kube-rbac-proxy/0.log"
Apr 16 18:41:13.842263 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:13.842241 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-9cb97cd87-wn88l_c547cf5f-3757-43c6-aec3-a7da5a6b053e/prometheus-operator-admission-webhook/0.log"
Apr 16 18:41:13.962686 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:13.962655 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-56d858f7df-dh959_87a9b265-79da-4fbd-8c71-83ff25214c57/thanos-query/0.log"
Apr 16 18:41:13.995254 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:13.995232 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-56d858f7df-dh959_87a9b265-79da-4fbd-8c71-83ff25214c57/kube-rbac-proxy-web/0.log"
Apr 16 18:41:14.026741 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:14.026722 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-56d858f7df-dh959_87a9b265-79da-4fbd-8c71-83ff25214c57/kube-rbac-proxy/0.log"
Apr 16 18:41:14.061485 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:14.061439 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-56d858f7df-dh959_87a9b265-79da-4fbd-8c71-83ff25214c57/prom-label-proxy/0.log"
Apr 16 18:41:14.090691 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:14.090669 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-56d858f7df-dh959_87a9b265-79da-4fbd-8c71-83ff25214c57/kube-rbac-proxy-rules/0.log"
Apr 16 18:41:14.112846 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:14.112822 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-56d858f7df-dh959_87a9b265-79da-4fbd-8c71-83ff25214c57/kube-rbac-proxy-metrics/0.log"
Apr 16 18:41:15.579214 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:15.579183 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-dn4qx_a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0/console-operator/2.log"
Apr 16 18:41:15.583111 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:15.583091 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-dn4qx_a22e2e3b-7179-49ec-8eda-9b8cf17c2ce0/console-operator/3.log"
Apr 16 18:41:15.987037 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:15.987011 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8f89b9d4c-5svtt_bb1d30f8-6006-4ff5-9f54-8076e4677c3a/console/0.log"
Apr 16 18:41:16.013583 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:16.013558 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-586b57c7b4-bmlww_635733b3-ada2-4619-ac36-00ecd1081e1e/download-server/0.log"
Apr 16 18:41:16.441736 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:16.441706 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-chptg/perf-node-gather-daemonset-5jhjx"]
Apr 16 18:41:16.442069 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:16.442057 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="db4fd461-bc31-4928-9356-791590b02269" containerName="console"
Apr 16 18:41:16.442108 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:16.442070 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="db4fd461-bc31-4928-9356-791590b02269" containerName="console"
Apr 16 18:41:16.442141 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:16.442132 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="db4fd461-bc31-4928-9356-791590b02269" containerName="console"
Apr 16 18:41:16.445107 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:16.445093 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-chptg/perf-node-gather-daemonset-5jhjx"
Apr 16 18:41:16.447452 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:16.447431 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-chptg\"/\"kube-root-ca.crt\""
Apr 16 18:41:16.448244 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:16.448226 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-chptg\"/\"default-dockercfg-cnqsp\""
Apr 16 18:41:16.448343 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:16.448226 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-chptg\"/\"openshift-service-ca.crt\""
Apr 16 18:41:16.454193 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:16.454171 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-chptg/perf-node-gather-daemonset-5jhjx"]
Apr 16 18:41:16.576299 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:16.576261 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2ba91857-445d-4e68-afaa-86d1f8cf1655-lib-modules\") pod \"perf-node-gather-daemonset-5jhjx\" (UID: \"2ba91857-445d-4e68-afaa-86d1f8cf1655\") " pod="openshift-must-gather-chptg/perf-node-gather-daemonset-5jhjx"
Apr 16 18:41:16.576496 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:16.576305 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2ba91857-445d-4e68-afaa-86d1f8cf1655-sys\") pod \"perf-node-gather-daemonset-5jhjx\" (UID: \"2ba91857-445d-4e68-afaa-86d1f8cf1655\") " pod="openshift-must-gather-chptg/perf-node-gather-daemonset-5jhjx"
Apr 16 18:41:16.576496 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:16.576360 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2ba91857-445d-4e68-afaa-86d1f8cf1655-proc\") pod \"perf-node-gather-daemonset-5jhjx\" (UID: \"2ba91857-445d-4e68-afaa-86d1f8cf1655\") " pod="openshift-must-gather-chptg/perf-node-gather-daemonset-5jhjx"
Apr 16 18:41:16.576496 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:16.576382 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ss47\" (UniqueName: \"kubernetes.io/projected/2ba91857-445d-4e68-afaa-86d1f8cf1655-kube-api-access-5ss47\") pod \"perf-node-gather-daemonset-5jhjx\" (UID: \"2ba91857-445d-4e68-afaa-86d1f8cf1655\") " pod="openshift-must-gather-chptg/perf-node-gather-daemonset-5jhjx"
Apr 16 18:41:16.576676 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:16.576519 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2ba91857-445d-4e68-afaa-86d1f8cf1655-podres\") pod \"perf-node-gather-daemonset-5jhjx\" (UID: \"2ba91857-445d-4e68-afaa-86d1f8cf1655\") " pod="openshift-must-gather-chptg/perf-node-gather-daemonset-5jhjx"
Apr 16 18:41:16.677347 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:16.677317 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2ba91857-445d-4e68-afaa-86d1f8cf1655-podres\") pod \"perf-node-gather-daemonset-5jhjx\" (UID: \"2ba91857-445d-4e68-afaa-86d1f8cf1655\") " pod="openshift-must-gather-chptg/perf-node-gather-daemonset-5jhjx"
Apr 16 18:41:16.677875 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:16.677349 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2ba91857-445d-4e68-afaa-86d1f8cf1655-lib-modules\") pod \"perf-node-gather-daemonset-5jhjx\" (UID: \"2ba91857-445d-4e68-afaa-86d1f8cf1655\") " pod="openshift-must-gather-chptg/perf-node-gather-daemonset-5jhjx"
Apr 16 18:41:16.677875 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:16.677378 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2ba91857-445d-4e68-afaa-86d1f8cf1655-sys\") pod \"perf-node-gather-daemonset-5jhjx\" (UID: \"2ba91857-445d-4e68-afaa-86d1f8cf1655\") " pod="openshift-must-gather-chptg/perf-node-gather-daemonset-5jhjx"
Apr 16 18:41:16.677875 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:16.677401 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2ba91857-445d-4e68-afaa-86d1f8cf1655-proc\") pod \"perf-node-gather-daemonset-5jhjx\" (UID: \"2ba91857-445d-4e68-afaa-86d1f8cf1655\") " pod="openshift-must-gather-chptg/perf-node-gather-daemonset-5jhjx"
Apr 16 18:41:16.677875 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:16.677474 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2ba91857-445d-4e68-afaa-86d1f8cf1655-sys\") pod \"perf-node-gather-daemonset-5jhjx\" (UID: \"2ba91857-445d-4e68-afaa-86d1f8cf1655\") " pod="openshift-must-gather-chptg/perf-node-gather-daemonset-5jhjx"
Apr 16 18:41:16.677875 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:16.677489 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2ba91857-445d-4e68-afaa-86d1f8cf1655-lib-modules\") pod \"perf-node-gather-daemonset-5jhjx\" (UID: \"2ba91857-445d-4e68-afaa-86d1f8cf1655\") " pod="openshift-must-gather-chptg/perf-node-gather-daemonset-5jhjx"
Apr 16 18:41:16.677875 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:16.677511 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2ba91857-445d-4e68-afaa-86d1f8cf1655-podres\") pod \"perf-node-gather-daemonset-5jhjx\" (UID: \"2ba91857-445d-4e68-afaa-86d1f8cf1655\") " pod="openshift-must-gather-chptg/perf-node-gather-daemonset-5jhjx"
Apr 16 18:41:16.677875 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:16.677531 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2ba91857-445d-4e68-afaa-86d1f8cf1655-proc\") pod \"perf-node-gather-daemonset-5jhjx\" (UID: \"2ba91857-445d-4e68-afaa-86d1f8cf1655\") " pod="openshift-must-gather-chptg/perf-node-gather-daemonset-5jhjx"
Apr 16 18:41:16.677875 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:16.677538 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ss47\" (UniqueName: \"kubernetes.io/projected/2ba91857-445d-4e68-afaa-86d1f8cf1655-kube-api-access-5ss47\") pod \"perf-node-gather-daemonset-5jhjx\" (UID: \"2ba91857-445d-4e68-afaa-86d1f8cf1655\") " pod="openshift-must-gather-chptg/perf-node-gather-daemonset-5jhjx"
Apr 16 18:41:16.685685 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:16.685658 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ss47\" (UniqueName: \"kubernetes.io/projected/2ba91857-445d-4e68-afaa-86d1f8cf1655-kube-api-access-5ss47\") pod \"perf-node-gather-daemonset-5jhjx\" (UID: \"2ba91857-445d-4e68-afaa-86d1f8cf1655\") " pod="openshift-must-gather-chptg/perf-node-gather-daemonset-5jhjx"
Apr 16 18:41:16.755832 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:16.755751 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-chptg/perf-node-gather-daemonset-5jhjx"
Apr 16 18:41:16.878538 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:16.875366 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-chptg/perf-node-gather-daemonset-5jhjx"]
Apr 16 18:41:16.881612 ip-10-0-131-203 kubenswrapper[2573]: W0416 18:41:16.881583 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2ba91857_445d_4e68_afaa_86d1f8cf1655.slice/crio-f4ce12c50ebac3fb9f05f4161f741dcf1b31f8a24718a3d32dd436f7f0c2d82d WatchSource:0}: Error finding container f4ce12c50ebac3fb9f05f4161f741dcf1b31f8a24718a3d32dd436f7f0c2d82d: Status 404 returned error can't find the container with id f4ce12c50ebac3fb9f05f4161f741dcf1b31f8a24718a3d32dd436f7f0c2d82d
Apr 16 18:41:16.883467 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:16.883418 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:41:17.277495 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:17.277465 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-whfl9_267cfa25-31fb-4ef1-af56-1f468ac12dc6/dns/0.log"
Apr 16 18:41:17.297489 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:17.297466 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-whfl9_267cfa25-31fb-4ef1-af56-1f468ac12dc6/kube-rbac-proxy/0.log"
Apr 16 18:41:17.342590 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:17.342566 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-ll55v_18fc33d6-c4dd-487b-8457-811880ffd3ea/dns-node-resolver/0.log"
Apr 16 18:41:17.842276 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:17.842234 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-d89lv_5146021c-a86d-4b5f-a47d-7f8c736f756e/node-ca/0.log"
Apr 16 18:41:17.858265 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:17.858238 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-chptg/perf-node-gather-daemonset-5jhjx" event={"ID":"2ba91857-445d-4e68-afaa-86d1f8cf1655","Type":"ContainerStarted","Data":"6760f251631a9c046b41297024aae62721650156c34e031a14aec6ff74f69786"}
Apr 16 18:41:17.858437 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:17.858275 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-chptg/perf-node-gather-daemonset-5jhjx" event={"ID":"2ba91857-445d-4e68-afaa-86d1f8cf1655","Type":"ContainerStarted","Data":"f4ce12c50ebac3fb9f05f4161f741dcf1b31f8a24718a3d32dd436f7f0c2d82d"}
Apr 16 18:41:17.858437 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:17.858327 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-chptg/perf-node-gather-daemonset-5jhjx"
Apr 16 18:41:17.873691 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:17.873649 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-chptg/perf-node-gather-daemonset-5jhjx" podStartSLOduration=1.873631592 podStartE2EDuration="1.873631592s" podCreationTimestamp="2026-04-16 18:41:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:41:17.872908247 +0000 UTC m=+1479.380173375" watchObservedRunningTime="2026-04-16 18:41:17.873631592 +0000 UTC m=+1479.380896707"
Apr 16 18:41:18.536419 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:18.536394 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-6564b967f4-t9vbs_38c20e31-955b-4eb0-8e64-330c1b15b52e/router/0.log"
Apr 16 18:41:18.900807 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:18.900783 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-fqp4t_b8503a04-7aaa-49ef-bec9-fb099ecb0065/serve-healthcheck-canary/0.log"
Apr 16 18:41:19.284558 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:19.284470 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-5hlbs_386777a9-63c1-4fa1-b894-4d73395765d3/insights-operator/0.log"
Apr 16 18:41:19.284703 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:19.284619 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-5hlbs_386777a9-63c1-4fa1-b894-4d73395765d3/insights-operator/1.log"
Apr 16 18:41:19.432531 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:19.432505 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-l8qct_cf14f56c-5787-44f0-936c-3bb1bd030d2b/kube-rbac-proxy/0.log"
Apr 16 18:41:19.452841 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:19.452819 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-l8qct_cf14f56c-5787-44f0-936c-3bb1bd030d2b/exporter/0.log"
Apr 16 18:41:19.471958 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:19.471934 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-l8qct_cf14f56c-5787-44f0-936c-3bb1bd030d2b/extractor/0.log"
Apr 16 18:41:21.436222 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:21.436194 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-x742d_ac39fd98-705a-44a6-8aa8-16a419e83ada/manager/0.log"
Apr 16 18:41:21.457325 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:21.457306 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-l7fgf_8f59b7f1-b9af-4e1f-b44d-2e917e8d4cf9/server/0.log"
Apr 16 18:41:21.533208 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:21.533178 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-gpppc_e373ca16-fe02-4c6f-929d-0ea49c8b7eba/manager/0.log"
Apr 16 18:41:23.871559 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:23.871515 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-chptg/perf-node-gather-daemonset-5jhjx"
Apr 16 18:41:26.913928 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:26.913898 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lp5l8_e39063f8-caec-45bd-ae4f-e11765edec8b/kube-multus-additional-cni-plugins/0.log"
Apr 16 18:41:26.936716 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:26.936692 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lp5l8_e39063f8-caec-45bd-ae4f-e11765edec8b/egress-router-binary-copy/0.log"
Apr 16 18:41:26.959049 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:26.959025 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lp5l8_e39063f8-caec-45bd-ae4f-e11765edec8b/cni-plugins/0.log"
Apr 16 18:41:26.979843 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:26.979818 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lp5l8_e39063f8-caec-45bd-ae4f-e11765edec8b/bond-cni-plugin/0.log"
Apr 16 18:41:27.000824 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:27.000802 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lp5l8_e39063f8-caec-45bd-ae4f-e11765edec8b/routeoverride-cni/0.log"
Apr 16 18:41:27.023749 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:27.023723 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lp5l8_e39063f8-caec-45bd-ae4f-e11765edec8b/whereabouts-cni-bincopy/0.log"
Apr 16 18:41:27.045458 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:27.045435 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lp5l8_e39063f8-caec-45bd-ae4f-e11765edec8b/whereabouts-cni/0.log"
Apr 16 18:41:27.241637 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:27.241568 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kq8d4_c4168765-8ac0-4395-a56d-b2991fa122e3/kube-multus/0.log"
Apr 16 18:41:27.385186 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:27.385159 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-sw2bl_a7930c14-4ef0-4949-a2ae-9a240da66c3c/network-metrics-daemon/0.log"
Apr 16 18:41:27.406454 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:27.406430 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-sw2bl_a7930c14-4ef0-4949-a2ae-9a240da66c3c/kube-rbac-proxy/0.log"
Apr 16 18:41:28.440245 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:28.440211 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj7xv_7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3/ovn-controller/0.log"
Apr 16 18:41:28.474438 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:28.474413 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj7xv_7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3/ovn-acl-logging/0.log"
Apr 16 18:41:28.506034 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:28.506010 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj7xv_7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3/kube-rbac-proxy-node/0.log"
Apr 16 18:41:28.530993 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:28.530973 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj7xv_7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 18:41:28.556749 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:28.556728 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj7xv_7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3/northd/0.log"
Apr 16 18:41:28.598816 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:28.598792 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj7xv_7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3/nbdb/0.log"
Apr 16 18:41:28.624452 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:28.624417 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj7xv_7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3/sbdb/0.log"
Apr 16 18:41:28.716379 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:28.716306 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tj7xv_7e7d73ee-b24c-4ba6-94cc-4c6e3044a3f3/ovnkube-controller/0.log"
Apr 16 18:41:30.086173 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:30.086149 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-7b678d77c7-8282h_ec84b517-4170-4c21-b909-567d1c8fe013/check-endpoints/0.log"
Apr 16 18:41:30.174887 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:30.174861 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-jgrxb_2571d812-7882-455b-be2f-4e3888df0e6a/network-check-target-container/0.log"
Apr 16 18:41:31.149938 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:31.149913 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-t6c2c_24c29d62-5808-4ee7-92cd-ce4e68faf741/iptables-alerter/0.log"
Apr 16 18:41:31.815037 ip-10-0-131-203 kubenswrapper[2573]: I0416 18:41:31.815011 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-bsv9k_23e0fe0d-a889-48ef-942d-1a02dac9ac5e/tuned/0.log"