Mar 18 16:44:32.162266 ip-10-0-135-99 systemd[1]: Starting Kubernetes Kubelet...
Mar 18 16:44:32.627096 ip-10-0-135-99 kubenswrapper[2573]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 18 16:44:32.627096 ip-10-0-135-99 kubenswrapper[2573]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 18 16:44:32.627096 ip-10-0-135-99 kubenswrapper[2573]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 18 16:44:32.627096 ip-10-0-135-99 kubenswrapper[2573]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 18 16:44:32.627096 ip-10-0-135-99 kubenswrapper[2573]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
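The deprecation warnings above all point the same way: move these flags into the file passed to --config. A minimal sketch of what that fragment of the config file could look like, using field names from the KubeletConfiguration v1beta1 API; the values shown here are illustrative placeholders, not taken from this node:

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces the deprecated --container-runtime-endpoint flag
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces the deprecated --volume-plugin-dir flag
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces the deprecated --system-reserved flag (sizes are illustrative)
systemReserved:
  cpu: 500m
  memory: 1Gi
# --minimum-container-ttl-duration is superseded by eviction thresholds
evictionHard:
  memory.available: 100Mi
```

On this node the config file is /etc/kubernetes/kubelet.conf (per the --config flag logged further down), so such settings would belong there.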
Mar 18 16:44:32.629223 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.629123 2573 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 18 16:44:32.632425 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632403 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 18 16:44:32.632425 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632424 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:44:32.632495 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632428 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 18 16:44:32.632495 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632431 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 18 16:44:32.632495 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632434 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 18 16:44:32.632495 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632437 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 18 16:44:32.632495 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632440 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 18 16:44:32.632495 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632443 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 18 16:44:32.632495 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632446 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 18 16:44:32.632495 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632449 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 16:44:32.632495 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632452 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 16:44:32.632495 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632462 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 18 16:44:32.632495 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632465 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 16:44:32.632495 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632468 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 16:44:32.632495 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632470 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:44:32.632495 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632473 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 18 16:44:32.632495 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632476 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 18 16:44:32.632495 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632478 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 18 16:44:32.632495 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632481 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 18 16:44:32.632495 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632483 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:44:32.632495 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632486 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 18 16:44:32.632495 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632489 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 18 16:44:32.633005 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632491 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 18 16:44:32.633005 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632494 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 16:44:32.633005 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632497 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:44:32.633005 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632501 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 16:44:32.633005 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632504 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 18 16:44:32.633005 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632507 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 16:44:32.633005 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632510 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 18 16:44:32.633005 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632513 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 18 16:44:32.633005 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632516 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 18 16:44:32.633005 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632518 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 16:44:32.633005 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632521 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 18 16:44:32.633005 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632524 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 18 16:44:32.633005 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632526 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 18 16:44:32.633005 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632529 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 16:44:32.633005 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632532 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 18 16:44:32.633005 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632536 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 16:44:32.633005 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632539 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 18 16:44:32.633005 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632541 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 18 16:44:32.633005 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632544 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 18 16:44:32.633005 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632546 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 16:44:32.633501 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632549 2573 feature_gate.go:328] unrecognized feature gate: Example2
Mar 18 16:44:32.633501 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632551 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 18 16:44:32.633501 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632554 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 18 16:44:32.633501 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632557 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 18 16:44:32.633501 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632560 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 18 16:44:32.633501 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632563 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 18 16:44:32.633501 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632566 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 18 16:44:32.633501 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632568 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 18 16:44:32.633501 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632573 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 16:44:32.633501 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632577 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 18 16:44:32.633501 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632580 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 16:44:32.633501 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632583 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 16:44:32.633501 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632586 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 18 16:44:32.633501 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632589 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 18 16:44:32.633501 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632592 2573 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:44:32.633501 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632595 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 18 16:44:32.633501 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632597 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 16:44:32.633501 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632600 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 16:44:32.633501 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632603 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 18 16:44:32.633501 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632606 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 18 16:44:32.634005 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632608 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:44:32.634005 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632611 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 18 16:44:32.634005 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632613 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 18 16:44:32.634005 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632616 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 18 16:44:32.634005 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632618 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:44:32.634005 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632621 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 18 16:44:32.634005 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632624 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:44:32.634005 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632626 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 18 16:44:32.634005 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632629 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 18 16:44:32.634005 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632632 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 18 16:44:32.634005 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632637 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 18 16:44:32.634005 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632640 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 18 16:44:32.634005 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632643 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 18 16:44:32.634005 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632646 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 18 16:44:32.634005 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632649 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 18 16:44:32.634005 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632652 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 18 16:44:32.634005 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632654 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:44:32.634005 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632657 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 18 16:44:32.634005 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632659 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 18 16:44:32.634456 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632662 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 18 16:44:32.634456 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632665 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 18 16:44:32.634456 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632669 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 16:44:32.634456 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632671 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:44:32.634456 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.632674 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:44:32.634456 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633166 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 18 16:44:32.634456 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633176 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 18 16:44:32.634456 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633180 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 16:44:32.634456 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633185 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 16:44:32.634456 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633189 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 18 16:44:32.634456 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633192 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 18 16:44:32.634456 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633195 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 18 16:44:32.634456 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633198 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 18 16:44:32.634456 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633201 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:44:32.634456 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633204 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 16:44:32.634456 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633206 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 16:44:32.634456 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633209 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 18 16:44:32.634456 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633212 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 16:44:32.634900 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633214 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 16:44:32.634900 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633217 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 18 16:44:32.634900 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633220 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 18 16:44:32.634900 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633223 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 18 16:44:32.634900 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633226 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 16:44:32.634900 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633229 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 18 16:44:32.634900 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633231 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 18 16:44:32.634900 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633234 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 18 16:44:32.634900 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633237 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 18 16:44:32.634900 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633239 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 18 16:44:32.634900 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633242 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:44:32.634900 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633244 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 18 16:44:32.634900 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633247 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 18 16:44:32.634900 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633249 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 16:44:32.634900 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633252 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 18 16:44:32.634900 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633255 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 18 16:44:32.634900 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633257 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 18 16:44:32.634900 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633260 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 18 16:44:32.634900 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633262 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:44:32.634900 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633265 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 18 16:44:32.635403 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633269 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 18 16:44:32.635403 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633272 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 16:44:32.635403 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633275 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 18 16:44:32.635403 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633277 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 18 16:44:32.635403 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633280 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 16:44:32.635403 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633283 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 18 16:44:32.635403 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633286 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 16:44:32.635403 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633289 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 18 16:44:32.635403 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633291 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:44:32.635403 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633294 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 18 16:44:32.635403 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633296 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:44:32.635403 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633299 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:44:32.635403 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633301 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 18 16:44:32.635403 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633304 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 16:44:32.635403 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633306 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 18 16:44:32.635403 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633309 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 18 16:44:32.635403 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633312 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 18 16:44:32.635403 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633315 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 18 16:44:32.635403 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633317 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 18 16:44:32.635403 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633320 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 16:44:32.635895 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633322 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 18 16:44:32.635895 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633325 2573 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:44:32.635895 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633327 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 18 16:44:32.635895 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633330 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 18 16:44:32.635895 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633333 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 18 16:44:32.635895 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633335 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 18 16:44:32.635895 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633338 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 18 16:44:32.635895 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633340 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 16:44:32.635895 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633343 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:44:32.635895 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633346 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 18 16:44:32.635895 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633349 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 18 16:44:32.635895 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633351 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 18 16:44:32.635895 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633355 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 18 16:44:32.635895 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633357 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 18 16:44:32.635895 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633360 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:44:32.635895 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633363 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 18 16:44:32.635895 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633365 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 18 16:44:32.635895 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633368 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 16:44:32.635895 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633371 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 18 16:44:32.635895 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633373 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 18 16:44:32.636400 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633376 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 18 16:44:32.636400 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633379 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:44:32.636400 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633383 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 18 16:44:32.636400 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633385 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 16:44:32.636400 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633388 2573 feature_gate.go:328] unrecognized feature gate: Example2
Mar 18 16:44:32.636400 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633391 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 16:44:32.636400 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633393 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 18 16:44:32.636400 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633396 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 18 16:44:32.636400 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633399 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 18 16:44:32.636400 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633402 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 18 16:44:32.636400 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633404 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 18 16:44:32.636400 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633407 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 18 16:44:32.636400 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.633409 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:44:32.636400 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634791 2573 flags.go:64] FLAG: --address="0.0.0.0"
Mar 18 16:44:32.636400 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634803 2573 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 18 16:44:32.636400 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634811 2573 flags.go:64] FLAG: --anonymous-auth="true"
Mar 18 16:44:32.636400 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634816 2573 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 18 16:44:32.636400 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634821 2573 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 18 16:44:32.636400 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634824 2573 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 18 16:44:32.636400 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634830 2573 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 18 16:44:32.636400 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634835 2573 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 18 16:44:32.636953 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634838 2573 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 18 16:44:32.636953 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634842 2573 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 18 16:44:32.636953 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634845 2573 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 18 16:44:32.636953 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634849 2573 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 18 16:44:32.636953 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634853 2573 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 18 16:44:32.636953 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634856 2573 flags.go:64] FLAG: --cgroup-root=""
Mar 18 16:44:32.636953 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634859 2573 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 18 16:44:32.636953 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634862 2573 flags.go:64] FLAG: --client-ca-file=""
Mar 18 16:44:32.636953 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634865 2573 flags.go:64] FLAG: --cloud-config=""
Mar 18 16:44:32.636953 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634868 2573 flags.go:64] FLAG: --cloud-provider="external"
Mar 18 16:44:32.636953 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634871 2573 flags.go:64] FLAG: --cluster-dns="[]"
Mar 18 16:44:32.636953 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634877 2573 flags.go:64] FLAG: --cluster-domain=""
Mar 18 16:44:32.636953 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634880 2573 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 18 16:44:32.636953 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634883 2573 flags.go:64] FLAG: --config-dir=""
Mar 18 16:44:32.636953 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634886 2573 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 18 16:44:32.636953 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634889 2573 flags.go:64] FLAG: --container-log-max-files="5"
Mar 18 16:44:32.636953 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634894 2573 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 18 16:44:32.636953 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634897 2573 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 18 16:44:32.636953 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634900 2573 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 18 16:44:32.636953 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634904 2573 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 18 16:44:32.636953 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634907 2573 flags.go:64] FLAG: --contention-profiling="false"
Mar 18 16:44:32.636953 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634911 2573 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 18 16:44:32.636953 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634915 2573 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 18 16:44:32.636953 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634918 2573 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 18 16:44:32.636953 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634922 2573 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 18 16:44:32.637561 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634926 2573 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 18 16:44:32.637561 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634930 2573 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 18 16:44:32.637561 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634933 2573 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 18 16:44:32.637561 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634946 2573 flags.go:64] FLAG: --enable-load-reader="false"
Mar 18 16:44:32.637561 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634950 2573 flags.go:64] FLAG: --enable-server="true"
Mar 18 16:44:32.637561 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634953 2573 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 18 16:44:32.637561 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634958 2573 flags.go:64] FLAG: --event-burst="100"
Mar 18 16:44:32.637561 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634961 2573 flags.go:64] FLAG: --event-qps="50"
Mar 18 16:44:32.637561 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634964 2573 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 18 16:44:32.637561 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634967 2573 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 18 16:44:32.637561 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634971 2573 flags.go:64] FLAG: --eviction-hard=""
Mar 18 16:44:32.637561 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634975 2573 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 18 16:44:32.637561 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634978 2573 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 18 16:44:32.637561 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634981 2573 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 18 16:44:32.637561 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634985 2573 flags.go:64] FLAG: --eviction-soft=""
Mar 18 16:44:32.637561 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634988 2573 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 18 16:44:32.637561 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634992 2573 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 18 16:44:32.637561 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634995 2573 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 18 16:44:32.637561 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.634998 2573 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 18 16:44:32.637561 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635001 2573 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 18 16:44:32.637561 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635004 2573 flags.go:64] FLAG: --fail-swap-on="true"
Mar 18 16:44:32.637561 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635007 2573 flags.go:64] FLAG: --feature-gates=""
Mar 18 16:44:32.637561 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635011 2573 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 18 16:44:32.637561 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635014 2573 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 18 16:44:32.637561 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635018 2573 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 18 16:44:32.638171 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635021 2573 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 18 16:44:32.638171 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635024 2573 flags.go:64] FLAG: --healthz-port="10248"
Mar 18
16:44:32.638171 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635033 2573 flags.go:64] FLAG: --help="false" Mar 18 16:44:32.638171 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635036 2573 flags.go:64] FLAG: --hostname-override="ip-10-0-135-99.ec2.internal" Mar 18 16:44:32.638171 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635040 2573 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 18 16:44:32.638171 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635043 2573 flags.go:64] FLAG: --http-check-frequency="20s" Mar 18 16:44:32.638171 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635046 2573 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Mar 18 16:44:32.638171 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635050 2573 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Mar 18 16:44:32.638171 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635053 2573 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 18 16:44:32.638171 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635057 2573 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 18 16:44:32.638171 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635060 2573 flags.go:64] FLAG: --image-service-endpoint="" Mar 18 16:44:32.638171 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635063 2573 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 18 16:44:32.638171 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635066 2573 flags.go:64] FLAG: --kube-api-burst="100" Mar 18 16:44:32.638171 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635069 2573 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 18 16:44:32.638171 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635072 2573 flags.go:64] FLAG: --kube-api-qps="50" Mar 18 16:44:32.638171 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635075 2573 
flags.go:64] FLAG: --kube-reserved="" Mar 18 16:44:32.638171 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635079 2573 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 18 16:44:32.638171 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635082 2573 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 18 16:44:32.638171 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635085 2573 flags.go:64] FLAG: --kubelet-cgroups="" Mar 18 16:44:32.638171 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635088 2573 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 18 16:44:32.638171 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635091 2573 flags.go:64] FLAG: --lock-file="" Mar 18 16:44:32.638171 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635095 2573 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 18 16:44:32.638171 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635098 2573 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 18 16:44:32.638171 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635101 2573 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 18 16:44:32.638798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635108 2573 flags.go:64] FLAG: --log-json-split-stream="false" Mar 18 16:44:32.638798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635111 2573 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 18 16:44:32.638798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635114 2573 flags.go:64] FLAG: --log-text-split-stream="false" Mar 18 16:44:32.638798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635117 2573 flags.go:64] FLAG: --logging-format="text" Mar 18 16:44:32.638798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635120 2573 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 18 16:44:32.638798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635124 2573 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 18 16:44:32.638798 ip-10-0-135-99 
kubenswrapper[2573]: I0318 16:44:32.635127 2573 flags.go:64] FLAG: --manifest-url="" Mar 18 16:44:32.638798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635130 2573 flags.go:64] FLAG: --manifest-url-header="" Mar 18 16:44:32.638798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635135 2573 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 18 16:44:32.638798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635138 2573 flags.go:64] FLAG: --max-open-files="1000000" Mar 18 16:44:32.638798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635144 2573 flags.go:64] FLAG: --max-pods="110" Mar 18 16:44:32.638798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635148 2573 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 18 16:44:32.638798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635151 2573 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 18 16:44:32.638798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635155 2573 flags.go:64] FLAG: --memory-manager-policy="None" Mar 18 16:44:32.638798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635158 2573 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 18 16:44:32.638798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635161 2573 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 18 16:44:32.638798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635164 2573 flags.go:64] FLAG: --node-ip="0.0.0.0" Mar 18 16:44:32.638798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635168 2573 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Mar 18 16:44:32.638798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635176 2573 flags.go:64] FLAG: --node-status-max-images="50" Mar 18 16:44:32.638798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635179 2573 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 18 16:44:32.638798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635182 2573 flags.go:64] 
FLAG: --oom-score-adj="-999" Mar 18 16:44:32.638798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635185 2573 flags.go:64] FLAG: --pod-cidr="" Mar 18 16:44:32.638798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635188 2573 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b3115b2610585407ab0742648cfbe39c72f57482889f0e778f5ac6fdc482217b" Mar 18 16:44:32.639380 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635195 2573 flags.go:64] FLAG: --pod-manifest-path="" Mar 18 16:44:32.639380 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635198 2573 flags.go:64] FLAG: --pod-max-pids="-1" Mar 18 16:44:32.639380 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635201 2573 flags.go:64] FLAG: --pods-per-core="0" Mar 18 16:44:32.639380 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635204 2573 flags.go:64] FLAG: --port="10250" Mar 18 16:44:32.639380 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635207 2573 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 18 16:44:32.639380 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635211 2573 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-012af27fb2c8055d4" Mar 18 16:44:32.639380 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635214 2573 flags.go:64] FLAG: --qos-reserved="" Mar 18 16:44:32.639380 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635218 2573 flags.go:64] FLAG: --read-only-port="10255" Mar 18 16:44:32.639380 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635221 2573 flags.go:64] FLAG: --register-node="true" Mar 18 16:44:32.639380 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635224 2573 flags.go:64] FLAG: --register-schedulable="true" Mar 18 16:44:32.639380 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635227 2573 flags.go:64] FLAG: --register-with-taints="" Mar 18 16:44:32.639380 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635231 2573 flags.go:64] FLAG: --registry-burst="10" Mar 18 16:44:32.639380 
ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635234 2573 flags.go:64] FLAG: --registry-qps="5" Mar 18 16:44:32.639380 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635236 2573 flags.go:64] FLAG: --reserved-cpus="" Mar 18 16:44:32.639380 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635239 2573 flags.go:64] FLAG: --reserved-memory="" Mar 18 16:44:32.639380 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635247 2573 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 18 16:44:32.639380 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635250 2573 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 18 16:44:32.639380 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635253 2573 flags.go:64] FLAG: --rotate-certificates="false" Mar 18 16:44:32.639380 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635256 2573 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 18 16:44:32.639380 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635259 2573 flags.go:64] FLAG: --runonce="false" Mar 18 16:44:32.639380 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635263 2573 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 18 16:44:32.639380 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635267 2573 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 18 16:44:32.639380 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635270 2573 flags.go:64] FLAG: --seccomp-default="false" Mar 18 16:44:32.639380 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635274 2573 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 18 16:44:32.639380 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635277 2573 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 18 16:44:32.639380 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635281 2573 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 18 16:44:32.640062 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635284 2573 flags.go:64] FLAG: 
--storage-driver-host="localhost:8086" Mar 18 16:44:32.640062 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635287 2573 flags.go:64] FLAG: --storage-driver-password="root" Mar 18 16:44:32.640062 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635290 2573 flags.go:64] FLAG: --storage-driver-secure="false" Mar 18 16:44:32.640062 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635293 2573 flags.go:64] FLAG: --storage-driver-table="stats" Mar 18 16:44:32.640062 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635297 2573 flags.go:64] FLAG: --storage-driver-user="root" Mar 18 16:44:32.640062 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635300 2573 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 18 16:44:32.640062 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635303 2573 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 18 16:44:32.640062 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635306 2573 flags.go:64] FLAG: --system-cgroups="" Mar 18 16:44:32.640062 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635309 2573 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Mar 18 16:44:32.640062 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635315 2573 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 18 16:44:32.640062 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635317 2573 flags.go:64] FLAG: --tls-cert-file="" Mar 18 16:44:32.640062 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635321 2573 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 18 16:44:32.640062 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635326 2573 flags.go:64] FLAG: --tls-min-version="" Mar 18 16:44:32.640062 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635329 2573 flags.go:64] FLAG: --tls-private-key-file="" Mar 18 16:44:32.640062 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635332 2573 flags.go:64] FLAG: --topology-manager-policy="none" Mar 18 16:44:32.640062 ip-10-0-135-99 kubenswrapper[2573]: 
I0318 16:44:32.635335 2573 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 18 16:44:32.640062 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635338 2573 flags.go:64] FLAG: --topology-manager-scope="container" Mar 18 16:44:32.640062 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635341 2573 flags.go:64] FLAG: --v="2" Mar 18 16:44:32.640062 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635346 2573 flags.go:64] FLAG: --version="false" Mar 18 16:44:32.640062 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635350 2573 flags.go:64] FLAG: --vmodule="" Mar 18 16:44:32.640062 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635355 2573 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 18 16:44:32.640062 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.635358 2573 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 18 16:44:32.640062 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635455 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Mar 18 16:44:32.640062 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635458 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 16:44:32.640727 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635461 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Mar 18 16:44:32.640727 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635465 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Mar 18 16:44:32.640727 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635469 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Mar 18 16:44:32.640727 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635472 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Mar 18 16:44:32.640727 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635476 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Mar 18 
16:44:32.640727 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635478 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Mar 18 16:44:32.640727 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635481 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Mar 18 16:44:32.640727 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635484 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Mar 18 16:44:32.640727 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635487 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Mar 18 16:44:32.640727 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635490 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Mar 18 16:44:32.640727 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635492 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Mar 18 16:44:32.640727 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635495 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 16:44:32.640727 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635498 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Mar 18 16:44:32.640727 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635502 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Mar 18 16:44:32.640727 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635506 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 16:44:32.640727 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635510 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 18 16:44:32.640727 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635513 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 18 16:44:32.640727 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635516 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:44:32.640727 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635519 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 18 16:44:32.641249 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635522 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 16:44:32.641249 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635527 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 18 16:44:32.641249 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635529 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 18 16:44:32.641249 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635532 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 18 16:44:32.641249 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635535 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 18 16:44:32.641249 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635538 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 18 16:44:32.641249 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635540 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 18 16:44:32.641249 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635543 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 16:44:32.641249 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635546 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 18 16:44:32.641249 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635548 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:44:32.641249 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635551 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 18 16:44:32.641249 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635553 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 16:44:32.641249 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635556 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:44:32.641249 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635559 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 18 16:44:32.641249 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635561 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 18 16:44:32.641249 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635565 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 18 16:44:32.641249 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635568 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:44:32.641249 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635571 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:44:32.641249 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635573 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 18 16:44:32.641249 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635576 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 18 16:44:32.641764 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635579 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 18 16:44:32.641764 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635582 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 18 16:44:32.641764 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635585 2573 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:44:32.641764 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635588 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 18 16:44:32.641764 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635590 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:44:32.641764 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635593 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 18 16:44:32.641764 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635596 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 18 16:44:32.641764 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635598 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 18 16:44:32.641764 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635601 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 18 16:44:32.641764 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635604 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 18 16:44:32.641764 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635607 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 16:44:32.641764 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635610 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 18 16:44:32.641764 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635612 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 16:44:32.641764 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635617 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 18 16:44:32.641764 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635619 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 18 16:44:32.641764 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635622 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 18 16:44:32.641764 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635625 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 18 16:44:32.641764 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635629 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 16:44:32.641764 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635632 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 18 16:44:32.641764 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635635 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 18 16:44:32.642280 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635638 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 18 16:44:32.642280 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635641 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 16:44:32.642280 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635643 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:44:32.642280 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635646 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 18 16:44:32.642280 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635648 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:44:32.642280 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635651 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 18 16:44:32.642280 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635654 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 16:44:32.642280 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635658 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 16:44:32.642280 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635660 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 18 16:44:32.642280 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635663 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 18 16:44:32.642280 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635666 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:44:32.642280 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635668 2573 feature_gate.go:328] unrecognized feature gate: Example2
Mar 18 16:44:32.642280 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635671 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 16:44:32.642280 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635675 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 16:44:32.642280 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635678 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 18 16:44:32.642280 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635680 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 18 16:44:32.642280 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635683 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 18 16:44:32.642280 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635685 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 18 16:44:32.642280 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635688 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 18 16:44:32.642280 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635691 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 18 16:44:32.642860 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635693 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 18 16:44:32.642860 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635696 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 18 16:44:32.642860 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635699 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:44:32.642860 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635701 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 18 16:44:32.642860 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.635704 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 16:44:32.642860 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.636286 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Mar 18 16:44:32.645404 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.645380 2573 server.go:530] "Kubelet version" kubeletVersion="v1.33.8"
Mar 18 16:44:32.645404 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.645405 2573 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 18 16:44:32.645478 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645460 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 16:44:32.645478 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645466 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 16:44:32.645478 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645470 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 18 16:44:32.645478 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645473 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 18 16:44:32.645478 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645476 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:44:32.645478 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645480 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 18 16:44:32.645628 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645483 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 18 16:44:32.645628 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645486 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 18 16:44:32.645628 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645489 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 18 16:44:32.645628 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645492 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 18 16:44:32.645628 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645496 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 18 16:44:32.645628 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645498 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 18 16:44:32.645628 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645501 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:44:32.645628 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645504 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 16:44:32.645628 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645507 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 18 16:44:32.645628 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645510 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 18 16:44:32.645628 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645513 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 18 16:44:32.645628 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645516 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 18 16:44:32.645628 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645519 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 18 16:44:32.645628 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645522 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 16:44:32.645628 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645524 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 18 16:44:32.645628 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645529 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 16:44:32.645628 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645533    2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 16:44:32.645628 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645537    2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 18 16:44:32.645628 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645540    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 16:44:32.645628 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645543    2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 18 16:44:32.646165 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645545    2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 18 16:44:32.646165 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645548    2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 18 16:44:32.646165 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645551    2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 18 16:44:32.646165 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645553    2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:44:32.646165 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645556    2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 18 16:44:32.646165 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645559    2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 18 16:44:32.646165 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645561    2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 16:44:32.646165 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645564    2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 18 16:44:32.646165 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645567    2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 18 16:44:32.646165 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645569    2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 18 16:44:32.646165 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645572    2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 18 16:44:32.646165 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645574    2573 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:44:32.646165 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645577    2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 18 16:44:32.646165 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645581    2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 18 16:44:32.646165 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645584    2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 18 16:44:32.646165 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645587    2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 16:44:32.646165 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645589    2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:44:32.646165 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645592    2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 18 16:44:32.646165 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645595    2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 18 16:44:32.646165 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645598    2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 18 16:44:32.646663 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645600    2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 16:44:32.646663 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645603    2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 18 16:44:32.646663 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645606    2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 18 16:44:32.646663 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645609    2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 18 16:44:32.646663 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645612    2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 16:44:32.646663 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645615    2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 18 16:44:32.646663 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645618    2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 16:44:32.646663 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645620    2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:44:32.646663 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645623    2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 18 16:44:32.646663 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645626    2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 18 16:44:32.646663 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645628    2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 18 16:44:32.646663 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645632    2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 18 16:44:32.646663 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645637    2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 18 16:44:32.646663 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645640    2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 18 16:44:32.646663 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645644    2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 18 16:44:32.646663 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645647    2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:44:32.646663 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645650    2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 18 16:44:32.646663 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645653    2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:44:32.646663 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645656    2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 18 16:44:32.646663 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645659    2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 18 16:44:32.647175 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645662    2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 18 16:44:32.647175 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645665    2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 18 16:44:32.647175 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645667    2573 feature_gate.go:328] unrecognized feature gate: Example2
Mar 18 16:44:32.647175 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645670    2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 16:44:32.647175 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645672    2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:44:32.647175 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645676    2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 16:44:32.647175 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645679    2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 16:44:32.647175 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645682    2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 16:44:32.647175 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645685    2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 18 16:44:32.647175 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645687    2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 18 16:44:32.647175 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645698    2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 18 16:44:32.647175 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645701    2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 18 16:44:32.647175 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645704    2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 16:44:32.647175 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645707    2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:44:32.647175 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645709    2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 18 16:44:32.647175 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645712    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 18 16:44:32.647175 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645715    2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 18 16:44:32.647175 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645717    2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 18 16:44:32.647175 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645720    2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 18 16:44:32.647645 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645723    2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:44:32.647645 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.645728    2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Mar 18 16:44:32.647645 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645826    2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 16:44:32.647645 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645831    2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 16:44:32.647645 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645834    2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 16:44:32.647645 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645837    2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 18 16:44:32.647645 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645840    2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 18 16:44:32.647645 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645843    2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 18 16:44:32.647645 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645846    2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 18 16:44:32.647645 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645849    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 18 16:44:32.647645 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645851    2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 18 16:44:32.647645 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645854    2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 18 16:44:32.647645 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645857    2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 18 16:44:32.647645 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645859    2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 18 16:44:32.647645 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645862    2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 18 16:44:32.648034 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645865    2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 16:44:32.648034 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645867    2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 16:44:32.648034 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645871    2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 16:44:32.648034 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645874    2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:44:32.648034 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645878    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 16:44:32.648034 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645880    2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:44:32.648034 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645883    2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:44:32.648034 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645886    2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 18 16:44:32.648034 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645888    2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 16:44:32.648034 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645892    2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 18 16:44:32.648034 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645895    2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 18 16:44:32.648034 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645897    2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 18 16:44:32.648034 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645900    2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 16:44:32.648034 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645903    2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 18 16:44:32.648034 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645905    2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 18 16:44:32.648034 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645908    2573 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:44:32.648034 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645911    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 18 16:44:32.648034 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645913    2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 18 16:44:32.648034 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645916    2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 18 16:44:32.648034 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645919    2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 18 16:44:32.648526 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645921    2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:44:32.648526 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645924    2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 18 16:44:32.648526 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645926    2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 18 16:44:32.648526 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645929    2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 18 16:44:32.648526 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645931    2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 16:44:32.648526 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645934    2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 18 16:44:32.648526 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645953    2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 18 16:44:32.648526 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645957    2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:44:32.648526 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645960    2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 16:44:32.648526 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645963    2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 18 16:44:32.648526 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645965    2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 18 16:44:32.648526 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645968    2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:44:32.648526 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645971    2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 16:44:32.648526 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645973    2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 18 16:44:32.648526 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645976    2573 feature_gate.go:328] unrecognized feature gate: Example2
Mar 18 16:44:32.648526 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645980    2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 18 16:44:32.648526 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645983    2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 18 16:44:32.648526 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645986    2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 18 16:44:32.648526 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645989    2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 18 16:44:32.648526 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645992    2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 18 16:44:32.649230 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645994    2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 18 16:44:32.649230 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.645997    2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 18 16:44:32.649230 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.646000    2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 16:44:32.649230 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.646003    2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 18 16:44:32.649230 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.646006    2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 16:44:32.649230 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.646009    2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 18 16:44:32.649230 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.646011    2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 18 16:44:32.649230 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.646014    2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 18 16:44:32.649230 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.646017    2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 18 16:44:32.649230 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.646020    2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 16:44:32.649230 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.646023    2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 16:44:32.649230 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.646025    2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 18 16:44:32.649230 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.646029    2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 16:44:32.649230 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.646032    2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 18 16:44:32.649230 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.646035    2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:44:32.649230 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.646038    2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 18 16:44:32.649230 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.646040    2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 18 16:44:32.649230 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.646043    2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 18 16:44:32.649230 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.646047    2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 18 16:44:32.649679 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.646050    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 18 16:44:32.649679 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.646054    2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 18 16:44:32.649679 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.646057    2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:44:32.649679 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.646060    2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 18 16:44:32.649679 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.646063    2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 18 16:44:32.649679 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.646066    2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 18 16:44:32.649679 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.646069    2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 18 16:44:32.649679 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.646071    2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 18 16:44:32.649679 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.646074    2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:44:32.649679 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.646077    2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 18 16:44:32.649679 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.646080    2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 18 16:44:32.649679 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.646083    2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 18 16:44:32.649679 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.646086    2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 18 16:44:32.649679 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:32.646088    2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:44:32.649679 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.646093    2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Mar 18 16:44:32.650065 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.646739    2573 server.go:962] "Client rotation is on, will bootstrap in background"
Mar 18 16:44:32.651156 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.651139    2573 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 18 16:44:32.652117 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.652104    2573 server.go:1019] "Starting client certificate rotation"
Mar 18 16:44:32.652218 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.652203    2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Mar 18 16:44:32.652254 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.652234    2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Mar 18 16:44:32.676950 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.676914    2573 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 18 16:44:32.678906 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.678883    2573 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 18 16:44:32.694083 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.694054    2573 log.go:25] "Validated CRI v1 runtime API"
Mar 18 16:44:32.700194 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.700174    2573 log.go:25] "Validated CRI v1 image API"
Mar 18 16:44:32.702617 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.702581    2573 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 18 16:44:32.706789 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.706762    2573 fs.go:135] Filesystem UUIDs: map[79099f59-5064-4181-bc88-60cf948c895d:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 ef5f092f-0967-4a49-8d59-1a0df8563da3:/dev/nvme0n1p4]
Mar 18 16:44:32.706886 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.706786    2573 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Mar 18 16:44:32.710984 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.710958    2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Mar 18 16:44:32.712895 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.712754    2573 manager.go:217] Machine: {Timestamp:2026-03-18 16:44:32.7107433 +0000 UTC m=+0.426166289 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099410 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2c78fe5dddeee2f680826fcefa9478 SystemUUID:ec2c78fe-5ddd-eee2-f680-826fcefa9478 BootID:3d347408-ac50-4b0e-839c-c455eb2028c5 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6094848 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:84:78:7e:86:8b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:84:78:7e:86:8b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:be:a4:f4:04:6b:ea Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 18 16:44:32.712895 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.712861    2573 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 18 16:44:32.713055 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.712975    2573 manager.go:233] Version: {KernelVersion:5.14.0-570.96.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260303-1 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 18 16:44:32.714130 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.714107    2573 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 18 16:44:32.714283 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.714133    2573 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-99.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 18 16:44:32.714329 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.714292    2573 topology_manager.go:138] "Creating topology manager with none policy"
Mar 18 16:44:32.714329 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.714302    2573 container_manager_linux.go:306] "Creating device plugin manager"
Mar 18 16:44:32.714329 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.714315    2573 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 18 16:44:32.715039 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.715027    2573 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 18 16:44:32.716384 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.716374    2573 state_mem.go:36] "Initialized new in-memory state store"
Mar 18 16:44:32.716509 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.716499    2573 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Mar 18 16:44:32.718720 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.718707    2573 kubelet.go:491] "Attempting to sync node with API server"
Mar 18 16:44:32.719415 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.719404    2573 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 18 16:44:32.719453 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.719435    2573 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 18 16:44:32.719453 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.719446    2573 kubelet.go:397] "Adding apiserver pod source"
Mar 18 16:44:32.719510 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.719456    2573 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Mar 18 16:44:32.720744 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.720728 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Mar 18 16:44:32.720834 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.720760 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Mar 18 16:44:32.723994 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.723976 2573 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.9-3.rhaos4.20.gitb9ac835.el9" apiVersion="v1" Mar 18 16:44:32.725984 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.725970 2573 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Mar 18 16:44:32.728373 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.728360 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 18 16:44:32.728423 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.728378 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 18 16:44:32.728423 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.728385 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 18 16:44:32.728423 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.728391 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 18 16:44:32.728423 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.728397 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 18 16:44:32.728423 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.728403 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 18 16:44:32.728423 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.728409 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 18 
16:44:32.728423 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.728414 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 18 16:44:32.728423 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.728421 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 18 16:44:32.728423 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.728428 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 18 16:44:32.728655 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.728437 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 18 16:44:32.728655 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.728446 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 18 16:44:32.729307 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.729292 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 18 16:44:32.729403 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.729331 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Mar 18 16:44:32.731600 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.731555 2573 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-135-99.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 16:44:32.732190 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:32.732168 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-135-99.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Mar 18 16:44:32.732241 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:32.732168 2573 reflector.go:200] "Failed to watch" err="failed to list 
*v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Mar 18 16:44:32.733020 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.733008 2573 watchdog_linux.go:99] "Systemd watchdog is not enabled" Mar 18 16:44:32.733120 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.733060 2573 server.go:1295] "Started kubelet" Mar 18 16:44:32.733295 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.733250 2573 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 18 16:44:32.733359 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.733348 2573 server_v1.go:47] "podresources" method="list" useActivePods=true Mar 18 16:44:32.733546 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.733418 2573 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Mar 18 16:44:32.734099 ip-10-0-135-99 systemd[1]: Started Kubernetes Kubelet. 
Mar 18 16:44:32.735183 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.735170 2573 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 18 16:44:32.736096 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.736076 2573 server.go:317] "Adding debug handlers to kubelet server" Mar 18 16:44:32.741185 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.741165 2573 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 18 16:44:32.741298 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.741195 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Mar 18 16:44:32.741859 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.741834 2573 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Mar 18 16:44:32.742127 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.742112 2573 volume_manager.go:295] "The desired_state_of_world populator starts" Mar 18 16:44:32.742253 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.742240 2573 volume_manager.go:297] "Starting Kubelet Volume Manager" Mar 18 16:44:32.742478 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.742466 2573 reconstruct.go:97] "Volume reconstruction finished" Mar 18 16:44:32.742598 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.742570 2573 reconciler.go:26] "Reconciler: start to sync state" Mar 18 16:44:32.742598 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:32.742461 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-99.ec2.internal\" not found" Mar 18 16:44:32.742752 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:32.742682 2573 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Mar 18 16:44:32.742852 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.742831 2573 factory.go:153] Registering CRI-O factory Mar 18 16:44:32.742967 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.742896 2573 factory.go:223] Registration of the crio container factory successfully Mar 18 16:44:32.743108 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.743093 2573 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 18 16:44:32.743156 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.743110 2573 factory.go:55] Registering systemd factory Mar 18 16:44:32.743156 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.743119 2573 factory.go:223] Registration of the systemd container factory successfully Mar 18 16:44:32.743156 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.743145 2573 factory.go:103] Registering Raw factory Mar 18 16:44:32.743288 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.743161 2573 manager.go:1196] Started watching for new ooms in manager Mar 18 16:44:32.743871 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.743857 2573 manager.go:319] Starting recovery of all containers Mar 18 16:44:32.749901 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.749863 2573 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Mar 18 16:44:32.750155 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:32.750131 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Mar 18 16:44:32.750259 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:32.750177 2573 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-135-99.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Mar 18 16:44:32.751082 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:32.750106 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-99.ec2.internal.189dfd3e684563e8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-99.ec2.internal,UID:ip-10-0-135-99.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-135-99.ec2.internal,},FirstTimestamp:2026-03-18 16:44:32.73302116 +0000 UTC m=+0.448444149,LastTimestamp:2026-03-18 16:44:32.73302116 +0000 UTC m=+0.448444149,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-99.ec2.internal,}" Mar 18 16:44:32.754315 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.754127 2573 manager.go:324] Recovery completed Mar 18 16:44:32.758886 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.758873 2573 
kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 18 16:44:32.759536 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.759516 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vc9vc" Mar 18 16:44:32.763130 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.763113 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-99.ec2.internal" event="NodeHasSufficientMemory" Mar 18 16:44:32.763210 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.763146 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-99.ec2.internal" event="NodeHasNoDiskPressure" Mar 18 16:44:32.763210 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.763161 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-99.ec2.internal" event="NodeHasSufficientPID" Mar 18 16:44:32.763703 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.763689 2573 cpu_manager.go:222] "Starting CPU manager" policy="none" Mar 18 16:44:32.763703 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.763703 2573 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Mar 18 16:44:32.763788 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.763720 2573 state_mem.go:36] "Initialized new in-memory state store" Mar 18 16:44:32.764685 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:32.764619 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-99.ec2.internal.189dfd3e6a10d074 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-99.ec2.internal,UID:ip-10-0-135-99.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-135-99.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-135-99.ec2.internal,},FirstTimestamp:2026-03-18 16:44:32.763129972 +0000 UTC m=+0.478552966,LastTimestamp:2026-03-18 16:44:32.763129972 +0000 UTC m=+0.478552966,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-99.ec2.internal,}" Mar 18 16:44:32.764896 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.764878 2573 policy_none.go:49] "None policy: Start" Mar 18 16:44:32.764934 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.764905 2573 memory_manager.go:186] "Starting memorymanager" policy="None" Mar 18 16:44:32.764934 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.764918 2573 state_mem.go:35] "Initializing new in-memory state store" Mar 18 16:44:32.767993 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.767977 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vc9vc" Mar 18 16:44:32.812685 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.809663 2573 manager.go:341] "Starting Device Plugin manager" Mar 18 16:44:32.812685 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:32.809709 2573 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 18 16:44:32.812685 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.809721 2573 server.go:85] "Starting device plugin registration server" Mar 18 16:44:32.812685 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.810024 2573 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 18 16:44:32.812685 ip-10-0-135-99 kubenswrapper[2573]: I0318 
16:44:32.810038 2573 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 18 16:44:32.812685 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.810146 2573 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 18 16:44:32.812685 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.810244 2573 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 18 16:44:32.812685 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.810252 2573 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 18 16:44:32.812685 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:32.810771 2573 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Mar 18 16:44:32.812685 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:32.810804 2573 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-99.ec2.internal\" not found" Mar 18 16:44:32.883077 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.882999 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Mar 18 16:44:32.883077 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.883036 2573 status_manager.go:230] "Starting to sync pod status with apiserver" Mar 18 16:44:32.883077 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.883061 2573 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Mar 18 16:44:32.883077 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.883072 2573 kubelet.go:2451] "Starting kubelet main sync loop" Mar 18 16:44:32.883327 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:32.883112 2573 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Mar 18 16:44:32.887997 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.887973 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:44:32.910647 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.910594 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 18 16:44:32.911648 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.911628 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-99.ec2.internal" event="NodeHasSufficientMemory" Mar 18 16:44:32.911757 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.911661 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-99.ec2.internal" event="NodeHasNoDiskPressure" Mar 18 16:44:32.911757 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.911673 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-99.ec2.internal" event="NodeHasSufficientPID" Mar 18 16:44:32.911757 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.911700 2573 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-99.ec2.internal" Mar 18 16:44:32.921537 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.921516 2573 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-99.ec2.internal" Mar 18 16:44:32.921644 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:32.921543 2573 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-135-99.ec2.internal\": node \"ip-10-0-135-99.ec2.internal\" not found" Mar 18 16:44:32.936659 
ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:32.936621 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-99.ec2.internal\" not found" Mar 18 16:44:32.983635 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.983569 2573 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-99.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-135-99.ec2.internal"] Mar 18 16:44:32.983806 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.983688 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 18 16:44:32.984698 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.984680 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-99.ec2.internal" event="NodeHasSufficientMemory" Mar 18 16:44:32.984787 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.984711 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-99.ec2.internal" event="NodeHasNoDiskPressure" Mar 18 16:44:32.984787 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.984720 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-99.ec2.internal" event="NodeHasSufficientPID" Mar 18 16:44:32.986118 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.986102 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 18 16:44:32.986273 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.986256 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-99.ec2.internal" Mar 18 16:44:32.986336 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.986294 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 18 16:44:32.986911 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.986893 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-99.ec2.internal" event="NodeHasSufficientMemory" Mar 18 16:44:32.987021 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.986927 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-99.ec2.internal" event="NodeHasNoDiskPressure" Mar 18 16:44:32.987021 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.986895 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-99.ec2.internal" event="NodeHasSufficientMemory" Mar 18 16:44:32.987021 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.986960 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-99.ec2.internal" event="NodeHasSufficientPID" Mar 18 16:44:32.987021 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.986971 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-99.ec2.internal" event="NodeHasNoDiskPressure" Mar 18 16:44:32.987021 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.986981 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-99.ec2.internal" event="NodeHasSufficientPID" Mar 18 16:44:32.988151 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.988137 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-99.ec2.internal" Mar 18 16:44:32.988227 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.988167 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 18 16:44:32.989005 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.988985 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-99.ec2.internal" event="NodeHasSufficientMemory" Mar 18 16:44:32.989005 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.989005 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-99.ec2.internal" event="NodeHasNoDiskPressure" Mar 18 16:44:32.989157 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:32.989018 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-99.ec2.internal" event="NodeHasSufficientPID" Mar 18 16:44:33.009902 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:33.009875 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-99.ec2.internal\" not found" node="ip-10-0-135-99.ec2.internal" Mar 18 16:44:33.014745 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:33.014723 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-99.ec2.internal\" not found" node="ip-10-0-135-99.ec2.internal" Mar 18 16:44:33.037456 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:33.037423 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-99.ec2.internal\" not found" Mar 18 16:44:33.044410 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:33.044387 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/760d2cc06861e4c286d713e86b836b9f-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-135-99.ec2.internal\" (UID: \"760d2cc06861e4c286d713e86b836b9f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-99.ec2.internal" Mar 18 16:44:33.044520 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:33.044419 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/760d2cc06861e4c286d713e86b836b9f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-99.ec2.internal\" (UID: \"760d2cc06861e4c286d713e86b836b9f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-99.ec2.internal" Mar 18 16:44:33.044520 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:33.044446 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ccad537f0fa817f43bb3d0d1231fcf27-config\") pod \"kube-apiserver-proxy-ip-10-0-135-99.ec2.internal\" (UID: \"ccad537f0fa817f43bb3d0d1231fcf27\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-99.ec2.internal" Mar 18 16:44:33.137632 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:33.137541 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-99.ec2.internal\" not found" Mar 18 16:44:33.145043 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:33.145013 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/760d2cc06861e4c286d713e86b836b9f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-99.ec2.internal\" (UID: \"760d2cc06861e4c286d713e86b836b9f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-99.ec2.internal" Mar 18 16:44:33.145126 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:33.145052 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/ccad537f0fa817f43bb3d0d1231fcf27-config\") pod \"kube-apiserver-proxy-ip-10-0-135-99.ec2.internal\" (UID: \"ccad537f0fa817f43bb3d0d1231fcf27\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-99.ec2.internal" Mar 18 16:44:33.145126 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:33.145071 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/760d2cc06861e4c286d713e86b836b9f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-99.ec2.internal\" (UID: \"760d2cc06861e4c286d713e86b836b9f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-99.ec2.internal" Mar 18 16:44:33.145126 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:33.145113 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/760d2cc06861e4c286d713e86b836b9f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-99.ec2.internal\" (UID: \"760d2cc06861e4c286d713e86b836b9f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-99.ec2.internal" Mar 18 16:44:33.145126 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:33.145117 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/760d2cc06861e4c286d713e86b836b9f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-99.ec2.internal\" (UID: \"760d2cc06861e4c286d713e86b836b9f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-99.ec2.internal" Mar 18 16:44:33.145268 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:33.145133 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ccad537f0fa817f43bb3d0d1231fcf27-config\") pod \"kube-apiserver-proxy-ip-10-0-135-99.ec2.internal\" (UID: \"ccad537f0fa817f43bb3d0d1231fcf27\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-135-99.ec2.internal" Mar 18 16:44:33.238460 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:33.238411 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-99.ec2.internal\" not found" Mar 18 16:44:33.312959 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:33.312910 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-99.ec2.internal" Mar 18 16:44:33.317671 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:33.317651 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-99.ec2.internal" Mar 18 16:44:33.338593 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:33.338557 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-99.ec2.internal\" not found" Mar 18 16:44:33.439199 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:33.439168 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-99.ec2.internal\" not found" Mar 18 16:44:33.539722 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:33.539680 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-99.ec2.internal\" not found" Mar 18 16:44:33.640235 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:33.640202 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-99.ec2.internal\" not found" Mar 18 16:44:33.652810 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:33.652784 2573 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 18 16:44:33.652989 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:33.652968 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" 
type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Mar 18 16:44:33.719211 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:33.719130 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:44:33.741099 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:33.741064 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-99.ec2.internal\" not found" Mar 18 16:44:33.742170 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:33.742156 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Mar 18 16:44:33.751722 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:33.751696 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Mar 18 16:44:33.769996 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:33.769956 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-03-17 16:39:32 +0000 UTC" deadline="2027-12-11 05:00:23.55439119 +0000 UTC" Mar 18 16:44:33.769996 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:33.769991 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15180h15m49.784403582s" Mar 18 16:44:33.774376 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:33.774352 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-bhkw7" Mar 18 16:44:33.782410 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:33.782385 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-bhkw7" Mar 18 16:44:33.842883 
ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:33.842849 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-99.ec2.internal\" not found" Mar 18 16:44:33.906979 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:33.906928 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccad537f0fa817f43bb3d0d1231fcf27.slice/crio-ccb55231852f3993d1f2fb75f96bfaf869ec2f326547d6ff1685fb8bd61db20f WatchSource:0}: Error finding container ccb55231852f3993d1f2fb75f96bfaf869ec2f326547d6ff1685fb8bd61db20f: Status 404 returned error can't find the container with id ccb55231852f3993d1f2fb75f96bfaf869ec2f326547d6ff1685fb8bd61db20f Mar 18 16:44:33.907360 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:33.907334 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod760d2cc06861e4c286d713e86b836b9f.slice/crio-94a919ea44dbe27e767a464bd509ef918523b34fe6fb7c4b9c6cd8c720c6d336 WatchSource:0}: Error finding container 94a919ea44dbe27e767a464bd509ef918523b34fe6fb7c4b9c6cd8c720c6d336: Status 404 returned error can't find the container with id 94a919ea44dbe27e767a464bd509ef918523b34fe6fb7c4b9c6cd8c720c6d336 Mar 18 16:44:33.911301 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:33.911285 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:44:33.943562 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:33.943524 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-99.ec2.internal\" not found" Mar 18 16:44:34.044101 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:34.044020 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-99.ec2.internal\" not found" Mar 18 16:44:34.144511 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:34.144475 2573 
kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-99.ec2.internal\" not found" Mar 18 16:44:34.231761 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.231732 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:44:34.241920 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.241885 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-99.ec2.internal" Mar 18 16:44:34.255080 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.255052 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 18 16:44:34.256999 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.256983 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-99.ec2.internal" Mar 18 16:44:34.265948 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.265921 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 18 16:44:34.660446 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.660416 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:44:34.720925 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.720877 2573 apiserver.go:52] "Watching apiserver" Mar 18 16:44:34.729414 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.729359 2573 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Mar 18 16:44:34.732011 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.731959 2573 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8h4n7","openshift-cluster-node-tuning-operator/tuned-wcssc","openshift-image-registry/node-ca-fxsfb","openshift-multus/multus-additional-cni-plugins-4npwk","openshift-multus/multus-zdh9l","openshift-multus/network-metrics-daemon-wtbdl","openshift-network-operator/iptables-alerter-2rst4","kube-system/kube-apiserver-proxy-ip-10-0-135-99.ec2.internal","openshift-dns/node-resolver-8dd92","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-99.ec2.internal","openshift-network-diagnostics/network-check-target-9smhv","openshift-ovn-kubernetes/ovnkube-node-qzdnc","kube-system/konnectivity-agent-jndjx"] Mar 18 16:44:34.733772 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.733730 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fxsfb" Mar 18 16:44:34.736425 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.736167 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8h4n7" Mar 18 16:44:34.736425 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.736354 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-2rst4" Mar 18 16:44:34.736973 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.736925 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-pxc2z\"" Mar 18 16:44:34.736973 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.736953 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Mar 18 16:44:34.737355 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.737279 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Mar 18 16:44:34.737355 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.737314 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Mar 18 16:44:34.739080 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.738691 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4npwk" Mar 18 16:44:34.739080 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.738801 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Mar 18 16:44:34.739223 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.739192 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Mar 18 16:44:34.739282 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.739240 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-mcmfr\"" Mar 18 16:44:34.739355 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.739331 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-czlpq\"" Mar 18 16:44:34.739650 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.739426 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Mar 18 16:44:34.739650 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.739540 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Mar 18 16:44:34.739797 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.739772 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Mar 18 16:44:34.740103 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.740083 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-zdh9l" Mar 18 16:44:34.740327 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.740220 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wtbdl" Mar 18 16:44:34.740394 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:34.740319 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wtbdl" podUID="52f2a3f3-56d7-41f7-8bed-9e7229d96408" Mar 18 16:44:34.740651 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.740546 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Mar 18 16:44:34.741041 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.741021 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Mar 18 16:44:34.741396 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.741327 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Mar 18 16:44:34.741497 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.741480 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-wcssc" Mar 18 16:44:34.741964 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.741730 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Mar 18 16:44:34.742347 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.742058 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Mar 18 16:44:34.742347 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.742075 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Mar 18 16:44:34.742347 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.742090 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-dzv4d\"" Mar 18 16:44:34.742604 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.742574 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Mar 18 16:44:34.744202 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.744181 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-bc7nv\"" Mar 18 16:44:34.744332 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.744294 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9smhv" Mar 18 16:44:34.744429 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:34.744361 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9smhv" podUID="8b451861-a208-430f-840a-bce654bef71f" Mar 18 16:44:34.744490 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.744454 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Mar 18 16:44:34.744882 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.744671 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Mar 18 16:44:34.744882 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.744706 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-srxh7\"" Mar 18 16:44:34.746036 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.746015 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8dd92" Mar 18 16:44:34.747688 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.747667 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-jndjx" Mar 18 16:44:34.747851 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.747783 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" Mar 18 16:44:34.748492 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.748464 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Mar 18 16:44:34.748635 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.748616 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-67h2l\"" Mar 18 16:44:34.748711 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.748674 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Mar 18 16:44:34.750765 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.750745 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Mar 18 16:44:34.750873 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.750806 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Mar 18 16:44:34.751031 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.751015 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Mar 18 16:44:34.751158 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.751136 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Mar 18 16:44:34.751231 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.751024 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Mar 18 16:44:34.751231 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.751202 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Mar 18 16:44:34.751323 
ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.751075 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-gt48b\"" Mar 18 16:44:34.751323 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.751019 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Mar 18 16:44:34.751474 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.751453 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-hxg5b\"" Mar 18 16:44:34.752045 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.752015 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Mar 18 16:44:34.754154 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.754072 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/02c83393-6d45-45b3-bfed-bccf6afbfe33-registration-dir\") pod \"aws-ebs-csi-driver-node-8h4n7\" (UID: \"02c83393-6d45-45b3-bfed-bccf6afbfe33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8h4n7" Mar 18 16:44:34.754154 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.754107 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/02c83393-6d45-45b3-bfed-bccf6afbfe33-device-dir\") pod \"aws-ebs-csi-driver-node-8h4n7\" (UID: \"02c83393-6d45-45b3-bfed-bccf6afbfe33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8h4n7" Mar 18 16:44:34.754299 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.754158 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-cni-binary-copy\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l" Mar 18 16:44:34.754299 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.754193 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-host-var-lib-cni-multus\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l" Mar 18 16:44:34.754299 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.754261 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/705f7aec-1965-4cf4-8f5d-218a3bc94e1a-host\") pod \"tuned-wcssc\" (UID: \"705f7aec-1965-4cf4-8f5d-218a3bc94e1a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcssc" Mar 18 16:44:34.754447 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.754311 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/705f7aec-1965-4cf4-8f5d-218a3bc94e1a-tmp\") pod \"tuned-wcssc\" (UID: \"705f7aec-1965-4cf4-8f5d-218a3bc94e1a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcssc" Mar 18 16:44:34.754447 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.754339 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ef1bc9f1-f14a-4cb6-8631-d7ab65c971c7-tmp-dir\") pod \"node-resolver-8dd92\" (UID: \"ef1bc9f1-f14a-4cb6-8631-d7ab65c971c7\") " pod="openshift-dns/node-resolver-8dd92" Mar 18 16:44:34.754447 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.754380 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7b009b7f-519d-4077-b100-93c7b9934af9-cni-binary-copy\") pod \"multus-additional-cni-plugins-4npwk\" (UID: \"7b009b7f-519d-4077-b100-93c7b9934af9\") " pod="openshift-multus/multus-additional-cni-plugins-4npwk" Mar 18 16:44:34.754447 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.754406 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b2f54d7b-6035-42a3-98a8-3ab8e13ee1ab-iptables-alerter-script\") pod \"iptables-alerter-2rst4\" (UID: \"b2f54d7b-6035-42a3-98a8-3ab8e13ee1ab\") " pod="openshift-network-operator/iptables-alerter-2rst4" Mar 18 16:44:34.754447 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.754444 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/705f7aec-1965-4cf4-8f5d-218a3bc94e1a-lib-modules\") pod \"tuned-wcssc\" (UID: \"705f7aec-1965-4cf4-8f5d-218a3bc94e1a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcssc" Mar 18 16:44:34.754676 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.754467 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eb98c083-6ae5-4745-a6be-ff841741f1f6-host\") pod \"node-ca-fxsfb\" (UID: \"eb98c083-6ae5-4745-a6be-ff841741f1f6\") " pod="openshift-image-registry/node-ca-fxsfb" Mar 18 16:44:34.754676 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.754530 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52f2a3f3-56d7-41f7-8bed-9e7229d96408-metrics-certs\") pod \"network-metrics-daemon-wtbdl\" (UID: \"52f2a3f3-56d7-41f7-8bed-9e7229d96408\") " pod="openshift-multus/network-metrics-daemon-wtbdl" Mar 18 16:44:34.754676 ip-10-0-135-99 
kubenswrapper[2573]: I0318 16:44:34.754557 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-host-run-netns\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l" Mar 18 16:44:34.754816 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.754700 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/705f7aec-1965-4cf4-8f5d-218a3bc94e1a-run\") pod \"tuned-wcssc\" (UID: \"705f7aec-1965-4cf4-8f5d-218a3bc94e1a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcssc" Mar 18 16:44:34.754816 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.754724 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/705f7aec-1965-4cf4-8f5d-218a3bc94e1a-etc-tuned\") pod \"tuned-wcssc\" (UID: \"705f7aec-1965-4cf4-8f5d-218a3bc94e1a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcssc" Mar 18 16:44:34.754816 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.754803 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ef1bc9f1-f14a-4cb6-8631-d7ab65c971c7-hosts-file\") pod \"node-resolver-8dd92\" (UID: \"ef1bc9f1-f14a-4cb6-8631-d7ab65c971c7\") " pod="openshift-dns/node-resolver-8dd92" Mar 18 16:44:34.754981 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.754826 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx29j\" (UniqueName: \"kubernetes.io/projected/eb98c083-6ae5-4745-a6be-ff841741f1f6-kube-api-access-xx29j\") pod \"node-ca-fxsfb\" (UID: \"eb98c083-6ae5-4745-a6be-ff841741f1f6\") " 
pod="openshift-image-registry/node-ca-fxsfb" Mar 18 16:44:34.754981 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.754851 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-multus-socket-dir-parent\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l" Mar 18 16:44:34.754981 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.754873 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cz9m\" (UniqueName: \"kubernetes.io/projected/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-kube-api-access-5cz9m\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l" Mar 18 16:44:34.754981 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.754898 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/705f7aec-1965-4cf4-8f5d-218a3bc94e1a-etc-kubernetes\") pod \"tuned-wcssc\" (UID: \"705f7aec-1965-4cf4-8f5d-218a3bc94e1a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcssc" Mar 18 16:44:34.754981 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.754920 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-host-var-lib-kubelet\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l" Mar 18 16:44:34.754981 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.754971 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/7b009b7f-519d-4077-b100-93c7b9934af9-os-release\") pod \"multus-additional-cni-plugins-4npwk\" (UID: \"7b009b7f-519d-4077-b100-93c7b9934af9\") " pod="openshift-multus/multus-additional-cni-plugins-4npwk" Mar 18 16:44:34.755273 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.754998 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7b009b7f-519d-4077-b100-93c7b9934af9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4npwk\" (UID: \"7b009b7f-519d-4077-b100-93c7b9934af9\") " pod="openshift-multus/multus-additional-cni-plugins-4npwk" Mar 18 16:44:34.755273 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.755022 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/02c83393-6d45-45b3-bfed-bccf6afbfe33-etc-selinux\") pod \"aws-ebs-csi-driver-node-8h4n7\" (UID: \"02c83393-6d45-45b3-bfed-bccf6afbfe33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8h4n7" Mar 18 16:44:34.755273 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.755044 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-multus-conf-dir\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l" Mar 18 16:44:34.755273 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.755076 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-host-run-multus-certs\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l" Mar 18 16:44:34.755273 ip-10-0-135-99 kubenswrapper[2573]: 
I0318 16:44:34.755102 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/705f7aec-1965-4cf4-8f5d-218a3bc94e1a-etc-sysctl-conf\") pod \"tuned-wcssc\" (UID: \"705f7aec-1965-4cf4-8f5d-218a3bc94e1a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcssc" Mar 18 16:44:34.755273 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.755157 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b2f54d7b-6035-42a3-98a8-3ab8e13ee1ab-host-slash\") pod \"iptables-alerter-2rst4\" (UID: \"b2f54d7b-6035-42a3-98a8-3ab8e13ee1ab\") " pod="openshift-network-operator/iptables-alerter-2rst4" Mar 18 16:44:34.755273 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.755209 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-multus-daemon-config\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l" Mar 18 16:44:34.755990 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.755964 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/705f7aec-1965-4cf4-8f5d-218a3bc94e1a-etc-systemd\") pod \"tuned-wcssc\" (UID: \"705f7aec-1965-4cf4-8f5d-218a3bc94e1a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcssc" Mar 18 16:44:34.756073 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.756021 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qxs5\" (UniqueName: \"kubernetes.io/projected/52f2a3f3-56d7-41f7-8bed-9e7229d96408-kube-api-access-9qxs5\") pod \"network-metrics-daemon-wtbdl\" (UID: 
\"52f2a3f3-56d7-41f7-8bed-9e7229d96408\") " pod="openshift-multus/network-metrics-daemon-wtbdl" Mar 18 16:44:34.756073 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.756057 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/705f7aec-1965-4cf4-8f5d-218a3bc94e1a-var-lib-kubelet\") pod \"tuned-wcssc\" (UID: \"705f7aec-1965-4cf4-8f5d-218a3bc94e1a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcssc" Mar 18 16:44:34.756302 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.756282 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-system-cni-dir\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l" Mar 18 16:44:34.756375 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.756320 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/705f7aec-1965-4cf4-8f5d-218a3bc94e1a-sys\") pod \"tuned-wcssc\" (UID: \"705f7aec-1965-4cf4-8f5d-218a3bc94e1a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcssc" Mar 18 16:44:34.756375 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.756354 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-cnibin\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l" Mar 18 16:44:34.756467 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.756385 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-host-var-lib-cni-bin\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l" Mar 18 16:44:34.756467 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.756417 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/705f7aec-1965-4cf4-8f5d-218a3bc94e1a-etc-sysconfig\") pod \"tuned-wcssc\" (UID: \"705f7aec-1965-4cf4-8f5d-218a3bc94e1a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcssc" Mar 18 16:44:34.756564 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.756467 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/02c83393-6d45-45b3-bfed-bccf6afbfe33-sys-fs\") pod \"aws-ebs-csi-driver-node-8h4n7\" (UID: \"02c83393-6d45-45b3-bfed-bccf6afbfe33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8h4n7" Mar 18 16:44:34.756564 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.756524 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7b009b7f-519d-4077-b100-93c7b9934af9-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4npwk\" (UID: \"7b009b7f-519d-4077-b100-93c7b9934af9\") " pod="openshift-multus/multus-additional-cni-plugins-4npwk" Mar 18 16:44:34.756659 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.756571 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-os-release\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l" Mar 18 16:44:34.756659 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.756599 
2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-etc-kubernetes\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l" Mar 18 16:44:34.756659 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.756632 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7b009b7f-519d-4077-b100-93c7b9934af9-system-cni-dir\") pod \"multus-additional-cni-plugins-4npwk\" (UID: \"7b009b7f-519d-4077-b100-93c7b9934af9\") " pod="openshift-multus/multus-additional-cni-plugins-4npwk" Mar 18 16:44:34.756794 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.756663 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/02c83393-6d45-45b3-bfed-bccf6afbfe33-socket-dir\") pod \"aws-ebs-csi-driver-node-8h4n7\" (UID: \"02c83393-6d45-45b3-bfed-bccf6afbfe33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8h4n7" Mar 18 16:44:34.756794 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.756692 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf979\" (UniqueName: \"kubernetes.io/projected/b2f54d7b-6035-42a3-98a8-3ab8e13ee1ab-kube-api-access-mf979\") pod \"iptables-alerter-2rst4\" (UID: \"b2f54d7b-6035-42a3-98a8-3ab8e13ee1ab\") " pod="openshift-network-operator/iptables-alerter-2rst4" Mar 18 16:44:34.756794 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.756711 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-multus-cni-dir\") pod \"multus-zdh9l\" (UID: 
\"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l" Mar 18 16:44:34.756928 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.756804 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/705f7aec-1965-4cf4-8f5d-218a3bc94e1a-etc-sysctl-d\") pod \"tuned-wcssc\" (UID: \"705f7aec-1965-4cf4-8f5d-218a3bc94e1a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcssc" Mar 18 16:44:34.756928 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.756851 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjblx\" (UniqueName: \"kubernetes.io/projected/705f7aec-1965-4cf4-8f5d-218a3bc94e1a-kube-api-access-sjblx\") pod \"tuned-wcssc\" (UID: \"705f7aec-1965-4cf4-8f5d-218a3bc94e1a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcssc" Mar 18 16:44:34.756928 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.756896 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-host-run-k8s-cni-cncf-io\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l" Mar 18 16:44:34.757095 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.757007 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/705f7aec-1965-4cf4-8f5d-218a3bc94e1a-etc-modprobe-d\") pod \"tuned-wcssc\" (UID: \"705f7aec-1965-4cf4-8f5d-218a3bc94e1a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcssc" Mar 18 16:44:34.757095 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.757072 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/7b009b7f-519d-4077-b100-93c7b9934af9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4npwk\" (UID: \"7b009b7f-519d-4077-b100-93c7b9934af9\") " pod="openshift-multus/multus-additional-cni-plugins-4npwk" Mar 18 16:44:34.757184 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.757143 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7b009b7f-519d-4077-b100-93c7b9934af9-cnibin\") pod \"multus-additional-cni-plugins-4npwk\" (UID: \"7b009b7f-519d-4077-b100-93c7b9934af9\") " pod="openshift-multus/multus-additional-cni-plugins-4npwk" Mar 18 16:44:34.757184 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.757176 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/02c83393-6d45-45b3-bfed-bccf6afbfe33-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8h4n7\" (UID: \"02c83393-6d45-45b3-bfed-bccf6afbfe33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8h4n7" Mar 18 16:44:34.757272 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.757204 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j75xw\" (UniqueName: \"kubernetes.io/projected/02c83393-6d45-45b3-bfed-bccf6afbfe33-kube-api-access-j75xw\") pod \"aws-ebs-csi-driver-node-8h4n7\" (UID: \"02c83393-6d45-45b3-bfed-bccf6afbfe33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8h4n7" Mar 18 16:44:34.757413 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.757391 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-hostroot\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l" Mar 18 16:44:34.757480 
ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.757439 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2c9m\" (UniqueName: \"kubernetes.io/projected/ef1bc9f1-f14a-4cb6-8631-d7ab65c971c7-kube-api-access-c2c9m\") pod \"node-resolver-8dd92\" (UID: \"ef1bc9f1-f14a-4cb6-8631-d7ab65c971c7\") " pod="openshift-dns/node-resolver-8dd92" Mar 18 16:44:34.757480 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.757474 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/eb98c083-6ae5-4745-a6be-ff841741f1f6-serviceca\") pod \"node-ca-fxsfb\" (UID: \"eb98c083-6ae5-4745-a6be-ff841741f1f6\") " pod="openshift-image-registry/node-ca-fxsfb" Mar 18 16:44:34.757571 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.757537 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccn6f\" (UniqueName: \"kubernetes.io/projected/7b009b7f-519d-4077-b100-93c7b9934af9-kube-api-access-ccn6f\") pod \"multus-additional-cni-plugins-4npwk\" (UID: \"7b009b7f-519d-4077-b100-93c7b9934af9\") " pod="openshift-multus/multus-additional-cni-plugins-4npwk" Mar 18 16:44:34.783769 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.783730 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-03-17 16:39:33 +0000 UTC" deadline="2027-08-17 01:22:26.78731285 +0000 UTC" Mar 18 16:44:34.783769 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.783766 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12392h37m52.003550221s" Mar 18 16:44:34.843210 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.843176 2573 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Mar 18 16:44:34.858584 ip-10-0-135-99 
kubenswrapper[2573]: I0318 16:44:34.858550 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c6018dac-2f72-43d9-b554-dffe8cf976c4-ovn-node-metrics-cert\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" Mar 18 16:44:34.858761 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.858596 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-host-run-netns\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l" Mar 18 16:44:34.858761 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.858622 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/705f7aec-1965-4cf4-8f5d-218a3bc94e1a-run\") pod \"tuned-wcssc\" (UID: \"705f7aec-1965-4cf4-8f5d-218a3bc94e1a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcssc" Mar 18 16:44:34.858761 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.858645 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/705f7aec-1965-4cf4-8f5d-218a3bc94e1a-etc-tuned\") pod \"tuned-wcssc\" (UID: \"705f7aec-1965-4cf4-8f5d-218a3bc94e1a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcssc" Mar 18 16:44:34.858761 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.858670 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ef1bc9f1-f14a-4cb6-8631-d7ab65c971c7-hosts-file\") pod \"node-resolver-8dd92\" (UID: \"ef1bc9f1-f14a-4cb6-8631-d7ab65c971c7\") " pod="openshift-dns/node-resolver-8dd92" Mar 18 16:44:34.858761 ip-10-0-135-99 kubenswrapper[2573]: I0318 
16:44:34.858688 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/705f7aec-1965-4cf4-8f5d-218a3bc94e1a-run\") pod \"tuned-wcssc\" (UID: \"705f7aec-1965-4cf4-8f5d-218a3bc94e1a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcssc" Mar 18 16:44:34.858761 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.858697 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xx29j\" (UniqueName: \"kubernetes.io/projected/eb98c083-6ae5-4745-a6be-ff841741f1f6-kube-api-access-xx29j\") pod \"node-ca-fxsfb\" (UID: \"eb98c083-6ae5-4745-a6be-ff841741f1f6\") " pod="openshift-image-registry/node-ca-fxsfb" Mar 18 16:44:34.858761 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.858666 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-host-run-netns\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l" Mar 18 16:44:34.858761 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.858729 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-host-cni-netd\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" Mar 18 16:44:34.858761 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.858747 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ef1bc9f1-f14a-4cb6-8631-d7ab65c971c7-hosts-file\") pod \"node-resolver-8dd92\" (UID: \"ef1bc9f1-f14a-4cb6-8631-d7ab65c971c7\") " pod="openshift-dns/node-resolver-8dd92" Mar 18 16:44:34.859212 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.858851 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-multus-socket-dir-parent\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l" Mar 18 16:44:34.859212 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.858888 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5cz9m\" (UniqueName: \"kubernetes.io/projected/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-kube-api-access-5cz9m\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l" Mar 18 16:44:34.859212 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.858915 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/705f7aec-1965-4cf4-8f5d-218a3bc94e1a-etc-kubernetes\") pod \"tuned-wcssc\" (UID: \"705f7aec-1965-4cf4-8f5d-218a3bc94e1a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcssc" Mar 18 16:44:34.859212 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.858935 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-multus-socket-dir-parent\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l" Mar 18 16:44:34.859212 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.858959 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c6018dac-2f72-43d9-b554-dffe8cf976c4-ovnkube-script-lib\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" Mar 18 16:44:34.859212 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.858988 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-host-var-lib-kubelet\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l" Mar 18 16:44:34.859212 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859012 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7b009b7f-519d-4077-b100-93c7b9934af9-os-release\") pod \"multus-additional-cni-plugins-4npwk\" (UID: \"7b009b7f-519d-4077-b100-93c7b9934af9\") " pod="openshift-multus/multus-additional-cni-plugins-4npwk" Mar 18 16:44:34.859212 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859018 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/705f7aec-1965-4cf4-8f5d-218a3bc94e1a-etc-kubernetes\") pod \"tuned-wcssc\" (UID: \"705f7aec-1965-4cf4-8f5d-218a3bc94e1a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcssc" Mar 18 16:44:34.859212 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859036 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7b009b7f-519d-4077-b100-93c7b9934af9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4npwk\" (UID: \"7b009b7f-519d-4077-b100-93c7b9934af9\") " pod="openshift-multus/multus-additional-cni-plugins-4npwk" Mar 18 16:44:34.859212 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859030 2573 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 18 16:44:34.859212 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859054 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-host-var-lib-kubelet\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l" Mar 18 16:44:34.859212 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859078 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/02c83393-6d45-45b3-bfed-bccf6afbfe33-etc-selinux\") pod \"aws-ebs-csi-driver-node-8h4n7\" (UID: \"02c83393-6d45-45b3-bfed-bccf6afbfe33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8h4n7" Mar 18 16:44:34.859212 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859087 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7b009b7f-519d-4077-b100-93c7b9934af9-os-release\") pod \"multus-additional-cni-plugins-4npwk\" (UID: \"7b009b7f-519d-4077-b100-93c7b9934af9\") " pod="openshift-multus/multus-additional-cni-plugins-4npwk" Mar 18 16:44:34.859212 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859112 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-multus-conf-dir\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l" Mar 18 16:44:34.859212 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859140 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-host-run-multus-certs\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l" Mar 18 16:44:34.859212 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859152 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7b009b7f-519d-4077-b100-93c7b9934af9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4npwk\" (UID: \"7b009b7f-519d-4077-b100-93c7b9934af9\") " pod="openshift-multus/multus-additional-cni-plugins-4npwk" Mar 18 16:44:34.859212 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859167 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/705f7aec-1965-4cf4-8f5d-218a3bc94e1a-etc-sysctl-conf\") pod \"tuned-wcssc\" (UID: \"705f7aec-1965-4cf4-8f5d-218a3bc94e1a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcssc" Mar 18 16:44:34.859212 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859178 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-multus-conf-dir\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l" Mar 18 16:44:34.860043 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859190 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/02c83393-6d45-45b3-bfed-bccf6afbfe33-etc-selinux\") pod \"aws-ebs-csi-driver-node-8h4n7\" (UID: \"02c83393-6d45-45b3-bfed-bccf6afbfe33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8h4n7" Mar 18 16:44:34.860043 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859193 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" 
(UniqueName: \"kubernetes.io/host-path/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-host-run-multus-certs\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l" Mar 18 16:44:34.860043 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859198 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c6018dac-2f72-43d9-b554-dffe8cf976c4-env-overrides\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" Mar 18 16:44:34.860043 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859241 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b2f54d7b-6035-42a3-98a8-3ab8e13ee1ab-host-slash\") pod \"iptables-alerter-2rst4\" (UID: \"b2f54d7b-6035-42a3-98a8-3ab8e13ee1ab\") " pod="openshift-network-operator/iptables-alerter-2rst4" Mar 18 16:44:34.860043 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859271 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-multus-daemon-config\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l" Mar 18 16:44:34.860043 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859291 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b2f54d7b-6035-42a3-98a8-3ab8e13ee1ab-host-slash\") pod \"iptables-alerter-2rst4\" (UID: \"b2f54d7b-6035-42a3-98a8-3ab8e13ee1ab\") " pod="openshift-network-operator/iptables-alerter-2rst4" Mar 18 16:44:34.860043 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859297 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" 
(UniqueName: \"kubernetes.io/host-path/705f7aec-1965-4cf4-8f5d-218a3bc94e1a-etc-systemd\") pod \"tuned-wcssc\" (UID: \"705f7aec-1965-4cf4-8f5d-218a3bc94e1a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcssc" Mar 18 16:44:34.860043 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859295 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/705f7aec-1965-4cf4-8f5d-218a3bc94e1a-etc-sysctl-conf\") pod \"tuned-wcssc\" (UID: \"705f7aec-1965-4cf4-8f5d-218a3bc94e1a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcssc" Mar 18 16:44:34.860043 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859345 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/705f7aec-1965-4cf4-8f5d-218a3bc94e1a-etc-systemd\") pod \"tuned-wcssc\" (UID: \"705f7aec-1965-4cf4-8f5d-218a3bc94e1a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcssc" Mar 18 16:44:34.860043 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859352 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qxs5\" (UniqueName: \"kubernetes.io/projected/52f2a3f3-56d7-41f7-8bed-9e7229d96408-kube-api-access-9qxs5\") pod \"network-metrics-daemon-wtbdl\" (UID: \"52f2a3f3-56d7-41f7-8bed-9e7229d96408\") " pod="openshift-multus/network-metrics-daemon-wtbdl" Mar 18 16:44:34.860043 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859384 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-log-socket\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" Mar 18 16:44:34.860043 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859411 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/705f7aec-1965-4cf4-8f5d-218a3bc94e1a-var-lib-kubelet\") pod \"tuned-wcssc\" (UID: \"705f7aec-1965-4cf4-8f5d-218a3bc94e1a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcssc" Mar 18 16:44:34.860043 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859440 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-system-cni-dir\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l" Mar 18 16:44:34.860043 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859464 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/705f7aec-1965-4cf4-8f5d-218a3bc94e1a-sys\") pod \"tuned-wcssc\" (UID: \"705f7aec-1965-4cf4-8f5d-218a3bc94e1a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcssc" Mar 18 16:44:34.860043 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859482 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/705f7aec-1965-4cf4-8f5d-218a3bc94e1a-var-lib-kubelet\") pod \"tuned-wcssc\" (UID: \"705f7aec-1965-4cf4-8f5d-218a3bc94e1a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcssc" Mar 18 16:44:34.860043 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859490 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-systemd-units\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" Mar 18 16:44:34.860043 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859520 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" Mar 18 16:44:34.860798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859548 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbbxq\" (UniqueName: \"kubernetes.io/projected/8b451861-a208-430f-840a-bce654bef71f-kube-api-access-hbbxq\") pod \"network-check-target-9smhv\" (UID: \"8b451861-a208-430f-840a-bce654bef71f\") " pod="openshift-network-diagnostics/network-check-target-9smhv" Mar 18 16:44:34.860798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859551 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-system-cni-dir\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l" Mar 18 16:44:34.860798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859579 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/705f7aec-1965-4cf4-8f5d-218a3bc94e1a-sys\") pod \"tuned-wcssc\" (UID: \"705f7aec-1965-4cf4-8f5d-218a3bc94e1a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcssc" Mar 18 16:44:34.860798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859577 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-cnibin\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l" Mar 18 16:44:34.860798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859630 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-host-var-lib-cni-bin\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l" Mar 18 16:44:34.860798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859638 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-cnibin\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l" Mar 18 16:44:34.860798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859656 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/705f7aec-1965-4cf4-8f5d-218a3bc94e1a-etc-sysconfig\") pod \"tuned-wcssc\" (UID: \"705f7aec-1965-4cf4-8f5d-218a3bc94e1a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcssc" Mar 18 16:44:34.860798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859675 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-host-var-lib-cni-bin\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l" Mar 18 16:44:34.860798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859686 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-host-slash\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" Mar 18 16:44:34.860798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859709 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-run-systemd\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc"
Mar 18 16:44:34.860798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859710 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/705f7aec-1965-4cf4-8f5d-218a3bc94e1a-etc-sysconfig\") pod \"tuned-wcssc\" (UID: \"705f7aec-1965-4cf4-8f5d-218a3bc94e1a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcssc"
Mar 18 16:44:34.860798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859735 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-run-ovn\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc"
Mar 18 16:44:34.860798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859779 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/02c83393-6d45-45b3-bfed-bccf6afbfe33-sys-fs\") pod \"aws-ebs-csi-driver-node-8h4n7\" (UID: \"02c83393-6d45-45b3-bfed-bccf6afbfe33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8h4n7"
Mar 18 16:44:34.860798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859850 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7b009b7f-519d-4077-b100-93c7b9934af9-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4npwk\" (UID: \"7b009b7f-519d-4077-b100-93c7b9934af9\") " pod="openshift-multus/multus-additional-cni-plugins-4npwk"
Mar 18 16:44:34.860798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859872 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/02c83393-6d45-45b3-bfed-bccf6afbfe33-sys-fs\") pod \"aws-ebs-csi-driver-node-8h4n7\" (UID: \"02c83393-6d45-45b3-bfed-bccf6afbfe33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8h4n7"
Mar 18 16:44:34.860798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859881 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-host-run-netns\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc"
Mar 18 16:44:34.860798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859907 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-var-lib-openvswitch\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc"
Mar 18 16:44:34.861540 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859963 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-os-release\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l"
Mar 18 16:44:34.861540 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.859991 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-etc-kubernetes\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l"
Mar 18 16:44:34.861540 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.860016 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7b009b7f-519d-4077-b100-93c7b9934af9-system-cni-dir\") pod \"multus-additional-cni-plugins-4npwk\" (UID: \"7b009b7f-519d-4077-b100-93c7b9934af9\") " pod="openshift-multus/multus-additional-cni-plugins-4npwk"
Mar 18 16:44:34.861540 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.860083 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2f0c6c9c-5cbc-4d92-9fee-ad30b7cb88a6-konnectivity-ca\") pod \"konnectivity-agent-jndjx\" (UID: \"2f0c6c9c-5cbc-4d92-9fee-ad30b7cb88a6\") " pod="kube-system/konnectivity-agent-jndjx"
Mar 18 16:44:34.861540 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.860091 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-os-release\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l"
Mar 18 16:44:34.861540 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.860121 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-etc-kubernetes\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l"
Mar 18 16:44:34.861540 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.860152 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7b009b7f-519d-4077-b100-93c7b9934af9-system-cni-dir\") pod \"multus-additional-cni-plugins-4npwk\" (UID: \"7b009b7f-519d-4077-b100-93c7b9934af9\") " pod="openshift-multus/multus-additional-cni-plugins-4npwk"
Mar 18 16:44:34.861540 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.860175 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/02c83393-6d45-45b3-bfed-bccf6afbfe33-socket-dir\") pod \"aws-ebs-csi-driver-node-8h4n7\" (UID: \"02c83393-6d45-45b3-bfed-bccf6afbfe33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8h4n7"
Mar 18 16:44:34.861540 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.860205 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mf979\" (UniqueName: \"kubernetes.io/projected/b2f54d7b-6035-42a3-98a8-3ab8e13ee1ab-kube-api-access-mf979\") pod \"iptables-alerter-2rst4\" (UID: \"b2f54d7b-6035-42a3-98a8-3ab8e13ee1ab\") " pod="openshift-network-operator/iptables-alerter-2rst4"
Mar 18 16:44:34.861540 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.860232 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-multus-cni-dir\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l"
Mar 18 16:44:34.861540 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.860258 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/705f7aec-1965-4cf4-8f5d-218a3bc94e1a-etc-sysctl-d\") pod \"tuned-wcssc\" (UID: \"705f7aec-1965-4cf4-8f5d-218a3bc94e1a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcssc"
Mar 18 16:44:34.861540 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.860280 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/02c83393-6d45-45b3-bfed-bccf6afbfe33-socket-dir\") pod \"aws-ebs-csi-driver-node-8h4n7\" (UID: \"02c83393-6d45-45b3-bfed-bccf6afbfe33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8h4n7"
Mar 18 16:44:34.861540 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.860303 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sjblx\" (UniqueName: \"kubernetes.io/projected/705f7aec-1965-4cf4-8f5d-218a3bc94e1a-kube-api-access-sjblx\") pod \"tuned-wcssc\" (UID: \"705f7aec-1965-4cf4-8f5d-218a3bc94e1a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcssc"
Mar 18 16:44:34.861540 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.860372 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-host-run-k8s-cni-cncf-io\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l"
Mar 18 16:44:34.861540 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.860400 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/705f7aec-1965-4cf4-8f5d-218a3bc94e1a-etc-modprobe-d\") pod \"tuned-wcssc\" (UID: \"705f7aec-1965-4cf4-8f5d-218a3bc94e1a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcssc"
Mar 18 16:44:34.861540 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.860412 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-multus-cni-dir\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l"
Mar 18 16:44:34.861540 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.860416 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7b009b7f-519d-4077-b100-93c7b9934af9-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4npwk\" (UID: \"7b009b7f-519d-4077-b100-93c7b9934af9\") " pod="openshift-multus/multus-additional-cni-plugins-4npwk"
Mar 18 16:44:34.862149 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.860426 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7b009b7f-519d-4077-b100-93c7b9934af9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4npwk\" (UID: \"7b009b7f-519d-4077-b100-93c7b9934af9\") " pod="openshift-multus/multus-additional-cni-plugins-4npwk"
Mar 18 16:44:34.862149 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.860527 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/705f7aec-1965-4cf4-8f5d-218a3bc94e1a-etc-sysctl-d\") pod \"tuned-wcssc\" (UID: \"705f7aec-1965-4cf4-8f5d-218a3bc94e1a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcssc"
Mar 18 16:44:34.862149 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.860573 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-host-run-k8s-cni-cncf-io\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l"
Mar 18 16:44:34.862149 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.860603 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-multus-daemon-config\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l"
Mar 18 16:44:34.862149 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.860597 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/705f7aec-1965-4cf4-8f5d-218a3bc94e1a-etc-modprobe-d\") pod \"tuned-wcssc\" (UID: \"705f7aec-1965-4cf4-8f5d-218a3bc94e1a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcssc"
Mar 18 16:44:34.862149 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.860623 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7b009b7f-519d-4077-b100-93c7b9934af9-cnibin\") pod \"multus-additional-cni-plugins-4npwk\" (UID: \"7b009b7f-519d-4077-b100-93c7b9934af9\") " pod="openshift-multus/multus-additional-cni-plugins-4npwk"
Mar 18 16:44:34.862149 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.860663 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7b009b7f-519d-4077-b100-93c7b9934af9-cnibin\") pod \"multus-additional-cni-plugins-4npwk\" (UID: \"7b009b7f-519d-4077-b100-93c7b9934af9\") " pod="openshift-multus/multus-additional-cni-plugins-4npwk"
Mar 18 16:44:34.862149 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.860706 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/02c83393-6d45-45b3-bfed-bccf6afbfe33-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8h4n7\" (UID: \"02c83393-6d45-45b3-bfed-bccf6afbfe33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8h4n7"
Mar 18 16:44:34.862149 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.860735 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j75xw\" (UniqueName: \"kubernetes.io/projected/02c83393-6d45-45b3-bfed-bccf6afbfe33-kube-api-access-j75xw\") pod \"aws-ebs-csi-driver-node-8h4n7\" (UID: \"02c83393-6d45-45b3-bfed-bccf6afbfe33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8h4n7"
Mar 18 16:44:34.862149 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.860752 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/02c83393-6d45-45b3-bfed-bccf6afbfe33-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8h4n7\" (UID: \"02c83393-6d45-45b3-bfed-bccf6afbfe33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8h4n7"
Mar 18 16:44:34.862149 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.860763 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-hostroot\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l"
Mar 18 16:44:34.862149 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.860788 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c2c9m\" (UniqueName: \"kubernetes.io/projected/ef1bc9f1-f14a-4cb6-8631-d7ab65c971c7-kube-api-access-c2c9m\") pod \"node-resolver-8dd92\" (UID: \"ef1bc9f1-f14a-4cb6-8631-d7ab65c971c7\") " pod="openshift-dns/node-resolver-8dd92"
Mar 18 16:44:34.862149 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.860814 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/eb98c083-6ae5-4745-a6be-ff841741f1f6-serviceca\") pod \"node-ca-fxsfb\" (UID: \"eb98c083-6ae5-4745-a6be-ff841741f1f6\") " pod="openshift-image-registry/node-ca-fxsfb"
Mar 18 16:44:34.862149 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.860834 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ccn6f\" (UniqueName: \"kubernetes.io/projected/7b009b7f-519d-4077-b100-93c7b9934af9-kube-api-access-ccn6f\") pod \"multus-additional-cni-plugins-4npwk\" (UID: \"7b009b7f-519d-4077-b100-93c7b9934af9\") " pod="openshift-multus/multus-additional-cni-plugins-4npwk"
Mar 18 16:44:34.862149 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.860866 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-etc-openvswitch\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc"
Mar 18 16:44:34.862149 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.860896 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7b009b7f-519d-4077-b100-93c7b9934af9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4npwk\" (UID: \"7b009b7f-519d-4077-b100-93c7b9934af9\") " pod="openshift-multus/multus-additional-cni-plugins-4npwk"
Mar 18 16:44:34.862149 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.860906 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/02c83393-6d45-45b3-bfed-bccf6afbfe33-registration-dir\") pod \"aws-ebs-csi-driver-node-8h4n7\" (UID: \"02c83393-6d45-45b3-bfed-bccf6afbfe33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8h4n7"
Mar 18 16:44:34.862693 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.860976 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/02c83393-6d45-45b3-bfed-bccf6afbfe33-registration-dir\") pod \"aws-ebs-csi-driver-node-8h4n7\" (UID: \"02c83393-6d45-45b3-bfed-bccf6afbfe33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8h4n7"
Mar 18 16:44:34.862693 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.860983 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/02c83393-6d45-45b3-bfed-bccf6afbfe33-device-dir\") pod \"aws-ebs-csi-driver-node-8h4n7\" (UID: \"02c83393-6d45-45b3-bfed-bccf6afbfe33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8h4n7"
Mar 18 16:44:34.862693 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.861043 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-hostroot\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l"
Mar 18 16:44:34.862693 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.861044 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/02c83393-6d45-45b3-bfed-bccf6afbfe33-device-dir\") pod \"aws-ebs-csi-driver-node-8h4n7\" (UID: \"02c83393-6d45-45b3-bfed-bccf6afbfe33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8h4n7"
Mar 18 16:44:34.862693 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.861193 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-cni-binary-copy\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l"
Mar 18 16:44:34.862693 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.861227 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-host-var-lib-cni-multus\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l"
Mar 18 16:44:34.862693 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.861254 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/705f7aec-1965-4cf4-8f5d-218a3bc94e1a-host\") pod \"tuned-wcssc\" (UID: \"705f7aec-1965-4cf4-8f5d-218a3bc94e1a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcssc"
Mar 18 16:44:34.862693 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.861278 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/705f7aec-1965-4cf4-8f5d-218a3bc94e1a-tmp\") pod \"tuned-wcssc\" (UID: \"705f7aec-1965-4cf4-8f5d-218a3bc94e1a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcssc"
Mar 18 16:44:34.862693 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.861307 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2f0c6c9c-5cbc-4d92-9fee-ad30b7cb88a6-agent-certs\") pod \"konnectivity-agent-jndjx\" (UID: \"2f0c6c9c-5cbc-4d92-9fee-ad30b7cb88a6\") " pod="kube-system/konnectivity-agent-jndjx"
Mar 18 16:44:34.862693 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.861333 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-host-run-ovn-kubernetes\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc"
Mar 18 16:44:34.862693 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.861344 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/eb98c083-6ae5-4745-a6be-ff841741f1f6-serviceca\") pod \"node-ca-fxsfb\" (UID: \"eb98c083-6ae5-4745-a6be-ff841741f1f6\") " pod="openshift-image-registry/node-ca-fxsfb"
Mar 18 16:44:34.862693 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.861344 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/705f7aec-1965-4cf4-8f5d-218a3bc94e1a-host\") pod \"tuned-wcssc\" (UID: \"705f7aec-1965-4cf4-8f5d-218a3bc94e1a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcssc"
Mar 18 16:44:34.862693 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.861362 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ef1bc9f1-f14a-4cb6-8631-d7ab65c971c7-tmp-dir\") pod \"node-resolver-8dd92\" (UID: \"ef1bc9f1-f14a-4cb6-8631-d7ab65c971c7\") " pod="openshift-dns/node-resolver-8dd92"
Mar 18 16:44:34.862693 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.861394 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7b009b7f-519d-4077-b100-93c7b9934af9-cni-binary-copy\") pod \"multus-additional-cni-plugins-4npwk\" (UID: \"7b009b7f-519d-4077-b100-93c7b9934af9\") " pod="openshift-multus/multus-additional-cni-plugins-4npwk"
Mar 18 16:44:34.862693 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.861422 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-host-kubelet\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc"
Mar 18 16:44:34.862693 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.861455 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-run-openvswitch\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc"
Mar 18 16:44:34.862693 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.861484 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c6018dac-2f72-43d9-b554-dffe8cf976c4-ovnkube-config\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc"
Mar 18 16:44:34.863247 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.861511 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxs4r\" (UniqueName: \"kubernetes.io/projected/c6018dac-2f72-43d9-b554-dffe8cf976c4-kube-api-access-lxs4r\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc"
Mar 18 16:44:34.863247 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.861538 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b2f54d7b-6035-42a3-98a8-3ab8e13ee1ab-iptables-alerter-script\") pod \"iptables-alerter-2rst4\" (UID: \"b2f54d7b-6035-42a3-98a8-3ab8e13ee1ab\") " pod="openshift-network-operator/iptables-alerter-2rst4"
Mar 18 16:44:34.863247 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.861567 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/705f7aec-1965-4cf4-8f5d-218a3bc94e1a-lib-modules\") pod \"tuned-wcssc\" (UID: \"705f7aec-1965-4cf4-8f5d-218a3bc94e1a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcssc"
Mar 18 16:44:34.863247 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.861590 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eb98c083-6ae5-4745-a6be-ff841741f1f6-host\") pod \"node-ca-fxsfb\" (UID: \"eb98c083-6ae5-4745-a6be-ff841741f1f6\") " pod="openshift-image-registry/node-ca-fxsfb"
Mar 18 16:44:34.863247 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.861618 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52f2a3f3-56d7-41f7-8bed-9e7229d96408-metrics-certs\") pod \"network-metrics-daemon-wtbdl\" (UID: \"52f2a3f3-56d7-41f7-8bed-9e7229d96408\") " pod="openshift-multus/network-metrics-daemon-wtbdl"
Mar 18 16:44:34.863247 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.861645 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-node-log\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc"
Mar 18 16:44:34.863247 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.861654 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ef1bc9f1-f14a-4cb6-8631-d7ab65c971c7-tmp-dir\") pod \"node-resolver-8dd92\" (UID: \"ef1bc9f1-f14a-4cb6-8631-d7ab65c971c7\") " pod="openshift-dns/node-resolver-8dd92"
Mar 18 16:44:34.863247 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.861672 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-host-cni-bin\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc"
Mar 18 16:44:34.863247 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.861729 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-host-var-lib-cni-multus\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l"
Mar 18 16:44:34.863247 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.861738 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-cni-binary-copy\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l"
Mar 18 16:44:34.863247 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.861792 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eb98c083-6ae5-4745-a6be-ff841741f1f6-host\") pod \"node-ca-fxsfb\" (UID: \"eb98c083-6ae5-4745-a6be-ff841741f1f6\") " pod="openshift-image-registry/node-ca-fxsfb"
Mar 18 16:44:34.863247 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.861848 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/705f7aec-1965-4cf4-8f5d-218a3bc94e1a-lib-modules\") pod \"tuned-wcssc\" (UID: \"705f7aec-1965-4cf4-8f5d-218a3bc94e1a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcssc"
Mar 18 16:44:34.863247 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.861901 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7b009b7f-519d-4077-b100-93c7b9934af9-cni-binary-copy\") pod \"multus-additional-cni-plugins-4npwk\" (UID: \"7b009b7f-519d-4077-b100-93c7b9934af9\") " pod="openshift-multus/multus-additional-cni-plugins-4npwk"
Mar 18 16:44:34.863247 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:34.861967 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:44:34.863247 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:34.862067 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52f2a3f3-56d7-41f7-8bed-9e7229d96408-metrics-certs podName:52f2a3f3-56d7-41f7-8bed-9e7229d96408 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:35.362035887 +0000 UTC m=+3.077458881 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/52f2a3f3-56d7-41f7-8bed-9e7229d96408-metrics-certs") pod "network-metrics-daemon-wtbdl" (UID: "52f2a3f3-56d7-41f7-8bed-9e7229d96408") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:44:34.863247 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.862266 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b2f54d7b-6035-42a3-98a8-3ab8e13ee1ab-iptables-alerter-script\") pod \"iptables-alerter-2rst4\" (UID: \"b2f54d7b-6035-42a3-98a8-3ab8e13ee1ab\") " pod="openshift-network-operator/iptables-alerter-2rst4"
Mar 18 16:44:34.863247 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.862635 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/705f7aec-1965-4cf4-8f5d-218a3bc94e1a-etc-tuned\") pod \"tuned-wcssc\" (UID: \"705f7aec-1965-4cf4-8f5d-218a3bc94e1a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcssc"
Mar 18 16:44:34.864619 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.864583 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/705f7aec-1965-4cf4-8f5d-218a3bc94e1a-tmp\") pod \"tuned-wcssc\" (UID: \"705f7aec-1965-4cf4-8f5d-218a3bc94e1a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcssc"
Mar 18 16:44:34.871809 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.871603 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qxs5\" (UniqueName: \"kubernetes.io/projected/52f2a3f3-56d7-41f7-8bed-9e7229d96408-kube-api-access-9qxs5\") pod \"network-metrics-daemon-wtbdl\" (UID: \"52f2a3f3-56d7-41f7-8bed-9e7229d96408\") " pod="openshift-multus/network-metrics-daemon-wtbdl"
Mar 18 16:44:34.871809 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.871759 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx29j\" (UniqueName: \"kubernetes.io/projected/eb98c083-6ae5-4745-a6be-ff841741f1f6-kube-api-access-xx29j\") pod \"node-ca-fxsfb\" (UID: \"eb98c083-6ae5-4745-a6be-ff841741f1f6\") " pod="openshift-image-registry/node-ca-fxsfb"
Mar 18 16:44:34.871809 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.871794 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2c9m\" (UniqueName: \"kubernetes.io/projected/ef1bc9f1-f14a-4cb6-8631-d7ab65c971c7-kube-api-access-c2c9m\") pod \"node-resolver-8dd92\" (UID: \"ef1bc9f1-f14a-4cb6-8631-d7ab65c971c7\") " pod="openshift-dns/node-resolver-8dd92"
Mar 18 16:44:34.872233 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.872204 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccn6f\" (UniqueName: \"kubernetes.io/projected/7b009b7f-519d-4077-b100-93c7b9934af9-kube-api-access-ccn6f\") pod \"multus-additional-cni-plugins-4npwk\" (UID: \"7b009b7f-519d-4077-b100-93c7b9934af9\") " pod="openshift-multus/multus-additional-cni-plugins-4npwk"
Mar 18 16:44:34.872426 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.872371 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf979\" (UniqueName: \"kubernetes.io/projected/b2f54d7b-6035-42a3-98a8-3ab8e13ee1ab-kube-api-access-mf979\") pod \"iptables-alerter-2rst4\" (UID: \"b2f54d7b-6035-42a3-98a8-3ab8e13ee1ab\") " pod="openshift-network-operator/iptables-alerter-2rst4"
Mar 18 16:44:34.873026 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.873002 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j75xw\" (UniqueName: \"kubernetes.io/projected/02c83393-6d45-45b3-bfed-bccf6afbfe33-kube-api-access-j75xw\") pod \"aws-ebs-csi-driver-node-8h4n7\" (UID: \"02c83393-6d45-45b3-bfed-bccf6afbfe33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8h4n7"
Mar 18 16:44:34.873233 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.873214 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjblx\" (UniqueName: \"kubernetes.io/projected/705f7aec-1965-4cf4-8f5d-218a3bc94e1a-kube-api-access-sjblx\") pod \"tuned-wcssc\" (UID: \"705f7aec-1965-4cf4-8f5d-218a3bc94e1a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcssc"
Mar 18 16:44:34.873564 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.873546 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cz9m\" (UniqueName: \"kubernetes.io/projected/6bd0541f-f19e-4cdf-b03e-55eabcf75d7e-kube-api-access-5cz9m\") pod \"multus-zdh9l\" (UID: \"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e\") " pod="openshift-multus/multus-zdh9l"
Mar 18 16:44:34.888215 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.888162 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-99.ec2.internal" event={"ID":"ccad537f0fa817f43bb3d0d1231fcf27","Type":"ContainerStarted","Data":"ccb55231852f3993d1f2fb75f96bfaf869ec2f326547d6ff1685fb8bd61db20f"}
Mar 18 16:44:34.889366 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.889320 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-99.ec2.internal" event={"ID":"760d2cc06861e4c286d713e86b836b9f","Type":"ContainerStarted","Data":"94a919ea44dbe27e767a464bd509ef918523b34fe6fb7c4b9c6cd8c720c6d336"}
Mar 18 16:44:34.962147 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.962110 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-log-socket\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc"
Mar 18 16:44:34.962147 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.962145 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-systemd-units\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc"
Mar 18 16:44:34.962394 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.962166 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc"
Mar 18 16:44:34.962394 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.962208 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc"
Mar 18 16:44:34.962394 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.962224 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-log-socket\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc"
Mar 18 16:44:34.962394 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.962230 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-systemd-units\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc"
Mar 18 16:44:34.962394 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.962256 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hbbxq\" (UniqueName: \"kubernetes.io/projected/8b451861-a208-430f-840a-bce654bef71f-kube-api-access-hbbxq\") pod \"network-check-target-9smhv\" (UID: \"8b451861-a208-430f-840a-bce654bef71f\") " pod="openshift-network-diagnostics/network-check-target-9smhv"
Mar 18 16:44:34.962538 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.962419 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-host-slash\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc"
Mar 18 16:44:34.962579 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.962563 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-host-slash\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc"
Mar 18 16:44:34.962613 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.962603 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-run-systemd\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc"
Mar 18 16:44:34.962648 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.962623 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-run-ovn\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc"
Mar 18 16:44:34.962648 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.962641 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-host-run-netns\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc"
Mar 18 16:44:34.962732 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.962659 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-var-lib-openvswitch\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc"
Mar 18 16:44:34.962732 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.962696 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-run-ovn\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc"
Mar 18 16:44:34.962732 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.962721 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-run-systemd\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc"
Mar 18 16:44:34.962848 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.962738 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName:
\"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-var-lib-openvswitch\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" Mar 18 16:44:34.962848 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.962697 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-host-run-netns\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" Mar 18 16:44:34.962848 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.962696 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2f0c6c9c-5cbc-4d92-9fee-ad30b7cb88a6-konnectivity-ca\") pod \"konnectivity-agent-jndjx\" (UID: \"2f0c6c9c-5cbc-4d92-9fee-ad30b7cb88a6\") " pod="kube-system/konnectivity-agent-jndjx" Mar 18 16:44:34.962848 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.962809 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-etc-openvswitch\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" Mar 18 16:44:34.962848 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.962838 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2f0c6c9c-5cbc-4d92-9fee-ad30b7cb88a6-agent-certs\") pod \"konnectivity-agent-jndjx\" (UID: \"2f0c6c9c-5cbc-4d92-9fee-ad30b7cb88a6\") " pod="kube-system/konnectivity-agent-jndjx" Mar 18 16:44:34.963098 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.962854 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-host-run-ovn-kubernetes\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" Mar 18 16:44:34.963098 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.962855 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-etc-openvswitch\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" Mar 18 16:44:34.963098 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.962879 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-host-kubelet\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" Mar 18 16:44:34.963098 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.962896 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-run-openvswitch\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" Mar 18 16:44:34.963098 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.962918 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c6018dac-2f72-43d9-b554-dffe8cf976c4-ovnkube-config\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" Mar 18 16:44:34.963098 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.962958 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-lxs4r\" (UniqueName: \"kubernetes.io/projected/c6018dac-2f72-43d9-b554-dffe8cf976c4-kube-api-access-lxs4r\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" Mar 18 16:44:34.963098 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.962961 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-host-run-ovn-kubernetes\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" Mar 18 16:44:34.963098 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.962981 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-run-openvswitch\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" Mar 18 16:44:34.963098 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.962968 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-host-kubelet\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" Mar 18 16:44:34.963098 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.963017 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-node-log\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" Mar 18 16:44:34.963098 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.963043 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-host-cni-bin\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" Mar 18 16:44:34.963098 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.963068 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c6018dac-2f72-43d9-b554-dffe8cf976c4-ovn-node-metrics-cert\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" Mar 18 16:44:34.963098 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.963089 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-node-log\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" Mar 18 16:44:34.963098 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.963099 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-host-cni-netd\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" Mar 18 16:44:34.963607 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.963116 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-host-cni-bin\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" Mar 18 16:44:34.963607 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.963135 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" 
(UniqueName: \"kubernetes.io/host-path/c6018dac-2f72-43d9-b554-dffe8cf976c4-host-cni-netd\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" Mar 18 16:44:34.963607 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.963143 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c6018dac-2f72-43d9-b554-dffe8cf976c4-ovnkube-script-lib\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" Mar 18 16:44:34.963607 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.963177 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c6018dac-2f72-43d9-b554-dffe8cf976c4-env-overrides\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" Mar 18 16:44:34.963607 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.963220 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2f0c6c9c-5cbc-4d92-9fee-ad30b7cb88a6-konnectivity-ca\") pod \"konnectivity-agent-jndjx\" (UID: \"2f0c6c9c-5cbc-4d92-9fee-ad30b7cb88a6\") " pod="kube-system/konnectivity-agent-jndjx" Mar 18 16:44:34.963764 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.963682 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c6018dac-2f72-43d9-b554-dffe8cf976c4-env-overrides\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" Mar 18 16:44:34.963764 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.963725 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" 
(UniqueName: \"kubernetes.io/configmap/c6018dac-2f72-43d9-b554-dffe8cf976c4-ovnkube-script-lib\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" Mar 18 16:44:34.964133 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.964111 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c6018dac-2f72-43d9-b554-dffe8cf976c4-ovnkube-config\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" Mar 18 16:44:34.965217 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.965195 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2f0c6c9c-5cbc-4d92-9fee-ad30b7cb88a6-agent-certs\") pod \"konnectivity-agent-jndjx\" (UID: \"2f0c6c9c-5cbc-4d92-9fee-ad30b7cb88a6\") " pod="kube-system/konnectivity-agent-jndjx" Mar 18 16:44:34.965479 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.965459 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c6018dac-2f72-43d9-b554-dffe8cf976c4-ovn-node-metrics-cert\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" Mar 18 16:44:34.968537 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:34.968511 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:44:34.968537 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:34.968537 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:44:34.968709 ip-10-0-135-99 kubenswrapper[2573]: E0318 
16:44:34.968553 2573 projected.go:194] Error preparing data for projected volume kube-api-access-hbbxq for pod openshift-network-diagnostics/network-check-target-9smhv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:34.968709 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:34.968681 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8b451861-a208-430f-840a-bce654bef71f-kube-api-access-hbbxq podName:8b451861-a208-430f-840a-bce654bef71f nodeName:}" failed. No retries permitted until 2026-03-18 16:44:35.468661526 +0000 UTC m=+3.184084521 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-hbbxq" (UniqueName: "kubernetes.io/projected/8b451861-a208-430f-840a-bce654bef71f-kube-api-access-hbbxq") pod "network-check-target-9smhv" (UID: "8b451861-a208-430f-840a-bce654bef71f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:34.971107 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:34.971088 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxs4r\" (UniqueName: \"kubernetes.io/projected/c6018dac-2f72-43d9-b554-dffe8cf976c4-kube-api-access-lxs4r\") pod \"ovnkube-node-qzdnc\" (UID: \"c6018dac-2f72-43d9-b554-dffe8cf976c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" Mar 18 16:44:35.047445 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:35.047410 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fxsfb" Mar 18 16:44:35.057371 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:35.057341 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8h4n7" Mar 18 16:44:35.069607 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:35.069566 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-2rst4" Mar 18 16:44:35.074433 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:35.074402 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4npwk" Mar 18 16:44:35.088212 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:35.088174 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zdh9l" Mar 18 16:44:35.096329 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:35.096301 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-wcssc" Mar 18 16:44:35.103008 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:35.102978 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8dd92" Mar 18 16:44:35.112651 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:35.112620 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-jndjx" Mar 18 16:44:35.118526 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:35.118500 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" Mar 18 16:44:35.198278 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:35.198254 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:44:35.365555 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:35.365472 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52f2a3f3-56d7-41f7-8bed-9e7229d96408-metrics-certs\") pod \"network-metrics-daemon-wtbdl\" (UID: \"52f2a3f3-56d7-41f7-8bed-9e7229d96408\") " pod="openshift-multus/network-metrics-daemon-wtbdl" Mar 18 16:44:35.365711 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:35.365624 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:35.365711 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:35.365706 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52f2a3f3-56d7-41f7-8bed-9e7229d96408-metrics-certs podName:52f2a3f3-56d7-41f7-8bed-9e7229d96408 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:36.365681221 +0000 UTC m=+4.081104203 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/52f2a3f3-56d7-41f7-8bed-9e7229d96408-metrics-certs") pod "network-metrics-daemon-wtbdl" (UID: "52f2a3f3-56d7-41f7-8bed-9e7229d96408") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:35.522784 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:35.522752 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b009b7f_519d_4077_b100_93c7b9934af9.slice/crio-4b17ee8692e74b02bc59efa93a27beb00405acf9c90f5f24eb337fecbbfa2067 WatchSource:0}: Error finding container 4b17ee8692e74b02bc59efa93a27beb00405acf9c90f5f24eb337fecbbfa2067: Status 404 returned error can't find the container with id 4b17ee8692e74b02bc59efa93a27beb00405acf9c90f5f24eb337fecbbfa2067 Mar 18 16:44:35.524689 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:35.524482 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bd0541f_f19e_4cdf_b03e_55eabcf75d7e.slice/crio-d63dd1e124c97a7a711e7c120143b062df802098747697083b83efc9d3621d74 WatchSource:0}: Error finding container d63dd1e124c97a7a711e7c120143b062df802098747697083b83efc9d3621d74: Status 404 returned error can't find the container with id d63dd1e124c97a7a711e7c120143b062df802098747697083b83efc9d3621d74 Mar 18 16:44:35.527973 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:35.527931 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6018dac_2f72_43d9_b554_dffe8cf976c4.slice/crio-f3b5e6763ea8feb6cc17cfbcd6ca670daeecbda196c1f003f703595ae765e6ed WatchSource:0}: Error finding container f3b5e6763ea8feb6cc17cfbcd6ca670daeecbda196c1f003f703595ae765e6ed: Status 404 returned error can't find the container with id f3b5e6763ea8feb6cc17cfbcd6ca670daeecbda196c1f003f703595ae765e6ed Mar 18 16:44:35.528849 
ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:35.528825 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb98c083_6ae5_4745_a6be_ff841741f1f6.slice/crio-3eaef26b65a85dad0224a55506b838330edc0357f7b372f3899f48addafda6ac WatchSource:0}: Error finding container 3eaef26b65a85dad0224a55506b838330edc0357f7b372f3899f48addafda6ac: Status 404 returned error can't find the container with id 3eaef26b65a85dad0224a55506b838330edc0357f7b372f3899f48addafda6ac Mar 18 16:44:35.529926 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:35.529782 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2f54d7b_6035_42a3_98a8_3ab8e13ee1ab.slice/crio-85adda56b6c2138cd5f315baafe7a48ae5ca18cb8bd7143827766f9527f2419a WatchSource:0}: Error finding container 85adda56b6c2138cd5f315baafe7a48ae5ca18cb8bd7143827766f9527f2419a: Status 404 returned error can't find the container with id 85adda56b6c2138cd5f315baafe7a48ae5ca18cb8bd7143827766f9527f2419a Mar 18 16:44:35.532429 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:35.532302 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f0c6c9c_5cbc_4d92_9fee_ad30b7cb88a6.slice/crio-488f415b6acc13e7b9f3810fd3986cd34cbc423654eed85821ef168213ecdaf3 WatchSource:0}: Error finding container 488f415b6acc13e7b9f3810fd3986cd34cbc423654eed85821ef168213ecdaf3: Status 404 returned error can't find the container with id 488f415b6acc13e7b9f3810fd3986cd34cbc423654eed85821ef168213ecdaf3 Mar 18 16:44:35.534096 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:35.534065 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod705f7aec_1965_4cf4_8f5d_218a3bc94e1a.slice/crio-ba78ef4aa9f560532148c999bcb613d578b0b462295f0f5ed894321d62f67884 WatchSource:0}: Error 
finding container ba78ef4aa9f560532148c999bcb613d578b0b462295f0f5ed894321d62f67884: Status 404 returned error can't find the container with id ba78ef4aa9f560532148c999bcb613d578b0b462295f0f5ed894321d62f67884 Mar 18 16:44:35.536570 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:35.536538 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02c83393_6d45_45b3_bfed_bccf6afbfe33.slice/crio-580eef8cf4df71be1ed6fae5eec0c6af061d8287b8593abf2153af1c04dd2942 WatchSource:0}: Error finding container 580eef8cf4df71be1ed6fae5eec0c6af061d8287b8593abf2153af1c04dd2942: Status 404 returned error can't find the container with id 580eef8cf4df71be1ed6fae5eec0c6af061d8287b8593abf2153af1c04dd2942 Mar 18 16:44:35.537891 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:44:35.537865 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef1bc9f1_f14a_4cb6_8631_d7ab65c971c7.slice/crio-338cee037851f6bc8c6254819c7740cb83a7494ad580bfc5370b02a18eb21a92 WatchSource:0}: Error finding container 338cee037851f6bc8c6254819c7740cb83a7494ad580bfc5370b02a18eb21a92: Status 404 returned error can't find the container with id 338cee037851f6bc8c6254819c7740cb83a7494ad580bfc5370b02a18eb21a92 Mar 18 16:44:35.566227 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:35.566180 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hbbxq\" (UniqueName: \"kubernetes.io/projected/8b451861-a208-430f-840a-bce654bef71f-kube-api-access-hbbxq\") pod \"network-check-target-9smhv\" (UID: \"8b451861-a208-430f-840a-bce654bef71f\") " pod="openshift-network-diagnostics/network-check-target-9smhv" Mar 18 16:44:35.566385 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:35.566347 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered Mar 18 16:44:35.566385 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:35.566369 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:44:35.566385 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:35.566382 2573 projected.go:194] Error preparing data for projected volume kube-api-access-hbbxq for pod openshift-network-diagnostics/network-check-target-9smhv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:35.566508 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:35.566430 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8b451861-a208-430f-840a-bce654bef71f-kube-api-access-hbbxq podName:8b451861-a208-430f-840a-bce654bef71f nodeName:}" failed. No retries permitted until 2026-03-18 16:44:36.566416171 +0000 UTC m=+4.281839147 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-hbbxq" (UniqueName: "kubernetes.io/projected/8b451861-a208-430f-840a-bce654bef71f-kube-api-access-hbbxq") pod "network-check-target-9smhv" (UID: "8b451861-a208-430f-840a-bce654bef71f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:35.785166 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:35.784972 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-03-17 16:39:33 +0000 UTC" deadline="2027-10-09 03:20:50.229386138 +0000 UTC" Mar 18 16:44:35.785166 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:35.785161 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13666h36m14.444229405s" Mar 18 16:44:35.892477 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:35.892440 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" event={"ID":"c6018dac-2f72-43d9-b554-dffe8cf976c4","Type":"ContainerStarted","Data":"f3b5e6763ea8feb6cc17cfbcd6ca670daeecbda196c1f003f703595ae765e6ed"} Mar 18 16:44:35.893452 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:35.893425 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zdh9l" event={"ID":"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e","Type":"ContainerStarted","Data":"d63dd1e124c97a7a711e7c120143b062df802098747697083b83efc9d3621d74"} Mar 18 16:44:35.894623 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:35.894569 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4npwk" event={"ID":"7b009b7f-519d-4077-b100-93c7b9934af9","Type":"ContainerStarted","Data":"4b17ee8692e74b02bc59efa93a27beb00405acf9c90f5f24eb337fecbbfa2067"} Mar 18 16:44:35.896112 ip-10-0-135-99 kubenswrapper[2573]: I0318 
16:44:35.896077 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-99.ec2.internal" event={"ID":"ccad537f0fa817f43bb3d0d1231fcf27","Type":"ContainerStarted","Data":"c196e367f7f52c971651a1e3749d80507a2323a18dc8e452a0a4903d1e61eef1"} Mar 18 16:44:35.898010 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:35.897971 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-wcssc" event={"ID":"705f7aec-1965-4cf4-8f5d-218a3bc94e1a","Type":"ContainerStarted","Data":"ba78ef4aa9f560532148c999bcb613d578b0b462295f0f5ed894321d62f67884"} Mar 18 16:44:35.899086 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:35.899057 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-jndjx" event={"ID":"2f0c6c9c-5cbc-4d92-9fee-ad30b7cb88a6","Type":"ContainerStarted","Data":"488f415b6acc13e7b9f3810fd3986cd34cbc423654eed85821ef168213ecdaf3"} Mar 18 16:44:35.900150 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:35.900124 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2rst4" event={"ID":"b2f54d7b-6035-42a3-98a8-3ab8e13ee1ab","Type":"ContainerStarted","Data":"85adda56b6c2138cd5f315baafe7a48ae5ca18cb8bd7143827766f9527f2419a"} Mar 18 16:44:35.901312 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:35.901294 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8dd92" event={"ID":"ef1bc9f1-f14a-4cb6-8631-d7ab65c971c7","Type":"ContainerStarted","Data":"338cee037851f6bc8c6254819c7740cb83a7494ad580bfc5370b02a18eb21a92"} Mar 18 16:44:35.902362 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:35.902339 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8h4n7" event={"ID":"02c83393-6d45-45b3-bfed-bccf6afbfe33","Type":"ContainerStarted","Data":"580eef8cf4df71be1ed6fae5eec0c6af061d8287b8593abf2153af1c04dd2942"} Mar 18 
16:44:35.903420 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:35.903402 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fxsfb" event={"ID":"eb98c083-6ae5-4745-a6be-ff841741f1f6","Type":"ContainerStarted","Data":"3eaef26b65a85dad0224a55506b838330edc0357f7b372f3899f48addafda6ac"} Mar 18 16:44:35.909464 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:35.909417 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-99.ec2.internal" podStartSLOduration=1.909404131 podStartE2EDuration="1.909404131s" podCreationTimestamp="2026-03-18 16:44:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:44:35.908905432 +0000 UTC m=+3.624328413" watchObservedRunningTime="2026-03-18 16:44:35.909404131 +0000 UTC m=+3.624827129" Mar 18 16:44:36.372592 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:36.372500 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52f2a3f3-56d7-41f7-8bed-9e7229d96408-metrics-certs\") pod \"network-metrics-daemon-wtbdl\" (UID: \"52f2a3f3-56d7-41f7-8bed-9e7229d96408\") " pod="openshift-multus/network-metrics-daemon-wtbdl" Mar 18 16:44:36.372760 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:36.372657 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:36.372760 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:36.372722 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52f2a3f3-56d7-41f7-8bed-9e7229d96408-metrics-certs podName:52f2a3f3-56d7-41f7-8bed-9e7229d96408 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:38.372703434 +0000 UTC m=+6.088126416 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/52f2a3f3-56d7-41f7-8bed-9e7229d96408-metrics-certs") pod "network-metrics-daemon-wtbdl" (UID: "52f2a3f3-56d7-41f7-8bed-9e7229d96408") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:36.574224 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:36.574171 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hbbxq\" (UniqueName: \"kubernetes.io/projected/8b451861-a208-430f-840a-bce654bef71f-kube-api-access-hbbxq\") pod \"network-check-target-9smhv\" (UID: \"8b451861-a208-430f-840a-bce654bef71f\") " pod="openshift-network-diagnostics/network-check-target-9smhv" Mar 18 16:44:36.574408 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:36.574377 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:44:36.574408 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:36.574395 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:44:36.574408 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:36.574408 2573 projected.go:194] Error preparing data for projected volume kube-api-access-hbbxq for pod openshift-network-diagnostics/network-check-target-9smhv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:36.574573 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:36.574467 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8b451861-a208-430f-840a-bce654bef71f-kube-api-access-hbbxq podName:8b451861-a208-430f-840a-bce654bef71f nodeName:}" failed. 
No retries permitted until 2026-03-18 16:44:38.574448579 +0000 UTC m=+6.289871572 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-hbbxq" (UniqueName: "kubernetes.io/projected/8b451861-a208-430f-840a-bce654bef71f-kube-api-access-hbbxq") pod "network-check-target-9smhv" (UID: "8b451861-a208-430f-840a-bce654bef71f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:36.885142 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:36.885106 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9smhv" Mar 18 16:44:36.885695 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:36.885246 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9smhv" podUID="8b451861-a208-430f-840a-bce654bef71f" Mar 18 16:44:36.885695 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:36.885301 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wtbdl" Mar 18 16:44:36.885695 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:36.885475 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wtbdl" podUID="52f2a3f3-56d7-41f7-8bed-9e7229d96408" Mar 18 16:44:36.918307 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:36.918263 2573 generic.go:358] "Generic (PLEG): container finished" podID="760d2cc06861e4c286d713e86b836b9f" containerID="b55eef6dfe13b28808c352bfa03cbc8bf1f744c90498212863f85193b3df2ad2" exitCode=0 Mar 18 16:44:36.918901 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:36.918868 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-99.ec2.internal" event={"ID":"760d2cc06861e4c286d713e86b836b9f","Type":"ContainerDied","Data":"b55eef6dfe13b28808c352bfa03cbc8bf1f744c90498212863f85193b3df2ad2"} Mar 18 16:44:37.933737 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:37.933699 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-99.ec2.internal" event={"ID":"760d2cc06861e4c286d713e86b836b9f","Type":"ContainerStarted","Data":"06931fbb76e4d8fbf83a4bdbe0653cd03017ecca6f887cb01ad49ae09bd600c6"} Mar 18 16:44:38.047520 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:38.047458 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-99.ec2.internal" podStartSLOduration=4.047435899 podStartE2EDuration="4.047435899s" podCreationTimestamp="2026-03-18 16:44:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:44:37.956186925 +0000 UTC m=+5.671609927" watchObservedRunningTime="2026-03-18 16:44:38.047435899 +0000 UTC m=+5.762858898" Mar 18 16:44:38.047852 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:38.047831 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-xkzjh"] Mar 18 16:44:38.049858 ip-10-0-135-99 kubenswrapper[2573]: I0318 
16:44:38.049833 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xkzjh" Mar 18 16:44:38.049995 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:38.049921 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xkzjh" podUID="ac3a60cb-dba0-4585-a37d-0402db777ed0" Mar 18 16:44:38.087025 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:38.086985 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ac3a60cb-dba0-4585-a37d-0402db777ed0-kubelet-config\") pod \"global-pull-secret-syncer-xkzjh\" (UID: \"ac3a60cb-dba0-4585-a37d-0402db777ed0\") " pod="kube-system/global-pull-secret-syncer-xkzjh" Mar 18 16:44:38.087290 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:38.087037 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ac3a60cb-dba0-4585-a37d-0402db777ed0-dbus\") pod \"global-pull-secret-syncer-xkzjh\" (UID: \"ac3a60cb-dba0-4585-a37d-0402db777ed0\") " pod="kube-system/global-pull-secret-syncer-xkzjh" Mar 18 16:44:38.087290 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:38.087068 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ac3a60cb-dba0-4585-a37d-0402db777ed0-original-pull-secret\") pod \"global-pull-secret-syncer-xkzjh\" (UID: \"ac3a60cb-dba0-4585-a37d-0402db777ed0\") " pod="kube-system/global-pull-secret-syncer-xkzjh" Mar 18 16:44:38.188234 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:38.187715 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ac3a60cb-dba0-4585-a37d-0402db777ed0-original-pull-secret\") pod \"global-pull-secret-syncer-xkzjh\" (UID: \"ac3a60cb-dba0-4585-a37d-0402db777ed0\") " pod="kube-system/global-pull-secret-syncer-xkzjh" Mar 18 16:44:38.188234 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:38.187839 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ac3a60cb-dba0-4585-a37d-0402db777ed0-kubelet-config\") pod \"global-pull-secret-syncer-xkzjh\" (UID: \"ac3a60cb-dba0-4585-a37d-0402db777ed0\") " pod="kube-system/global-pull-secret-syncer-xkzjh" Mar 18 16:44:38.188234 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:38.187871 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ac3a60cb-dba0-4585-a37d-0402db777ed0-dbus\") pod \"global-pull-secret-syncer-xkzjh\" (UID: \"ac3a60cb-dba0-4585-a37d-0402db777ed0\") " pod="kube-system/global-pull-secret-syncer-xkzjh" Mar 18 16:44:38.188234 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:38.188071 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ac3a60cb-dba0-4585-a37d-0402db777ed0-dbus\") pod \"global-pull-secret-syncer-xkzjh\" (UID: \"ac3a60cb-dba0-4585-a37d-0402db777ed0\") " pod="kube-system/global-pull-secret-syncer-xkzjh" Mar 18 16:44:38.188234 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:38.188181 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Mar 18 16:44:38.188234 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:38.188238 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac3a60cb-dba0-4585-a37d-0402db777ed0-original-pull-secret 
podName:ac3a60cb-dba0-4585-a37d-0402db777ed0 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:38.688219141 +0000 UTC m=+6.403642134 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ac3a60cb-dba0-4585-a37d-0402db777ed0-original-pull-secret") pod "global-pull-secret-syncer-xkzjh" (UID: "ac3a60cb-dba0-4585-a37d-0402db777ed0") : object "kube-system"/"original-pull-secret" not registered Mar 18 16:44:38.188658 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:38.188489 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ac3a60cb-dba0-4585-a37d-0402db777ed0-kubelet-config\") pod \"global-pull-secret-syncer-xkzjh\" (UID: \"ac3a60cb-dba0-4585-a37d-0402db777ed0\") " pod="kube-system/global-pull-secret-syncer-xkzjh" Mar 18 16:44:38.389860 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:38.389269 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52f2a3f3-56d7-41f7-8bed-9e7229d96408-metrics-certs\") pod \"network-metrics-daemon-wtbdl\" (UID: \"52f2a3f3-56d7-41f7-8bed-9e7229d96408\") " pod="openshift-multus/network-metrics-daemon-wtbdl" Mar 18 16:44:38.389860 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:38.389430 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:38.389860 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:38.389498 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52f2a3f3-56d7-41f7-8bed-9e7229d96408-metrics-certs podName:52f2a3f3-56d7-41f7-8bed-9e7229d96408 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:42.389478909 +0000 UTC m=+10.104901895 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/52f2a3f3-56d7-41f7-8bed-9e7229d96408-metrics-certs") pod "network-metrics-daemon-wtbdl" (UID: "52f2a3f3-56d7-41f7-8bed-9e7229d96408") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:38.590862 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:38.590774 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hbbxq\" (UniqueName: \"kubernetes.io/projected/8b451861-a208-430f-840a-bce654bef71f-kube-api-access-hbbxq\") pod \"network-check-target-9smhv\" (UID: \"8b451861-a208-430f-840a-bce654bef71f\") " pod="openshift-network-diagnostics/network-check-target-9smhv" Mar 18 16:44:38.591054 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:38.590999 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:44:38.591054 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:38.591017 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:44:38.591054 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:38.591031 2573 projected.go:194] Error preparing data for projected volume kube-api-access-hbbxq for pod openshift-network-diagnostics/network-check-target-9smhv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:38.591199 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:38.591089 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8b451861-a208-430f-840a-bce654bef71f-kube-api-access-hbbxq podName:8b451861-a208-430f-840a-bce654bef71f nodeName:}" failed. 
No retries permitted until 2026-03-18 16:44:42.591071002 +0000 UTC m=+10.306493985 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-hbbxq" (UniqueName: "kubernetes.io/projected/8b451861-a208-430f-840a-bce654bef71f-kube-api-access-hbbxq") pod "network-check-target-9smhv" (UID: "8b451861-a208-430f-840a-bce654bef71f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:38.691230 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:38.691188 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ac3a60cb-dba0-4585-a37d-0402db777ed0-original-pull-secret\") pod \"global-pull-secret-syncer-xkzjh\" (UID: \"ac3a60cb-dba0-4585-a37d-0402db777ed0\") " pod="kube-system/global-pull-secret-syncer-xkzjh" Mar 18 16:44:38.691480 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:38.691448 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Mar 18 16:44:38.691594 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:38.691530 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac3a60cb-dba0-4585-a37d-0402db777ed0-original-pull-secret podName:ac3a60cb-dba0-4585-a37d-0402db777ed0 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:39.691510202 +0000 UTC m=+7.406933200 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ac3a60cb-dba0-4585-a37d-0402db777ed0-original-pull-secret") pod "global-pull-secret-syncer-xkzjh" (UID: "ac3a60cb-dba0-4585-a37d-0402db777ed0") : object "kube-system"/"original-pull-secret" not registered Mar 18 16:44:38.884139 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:38.884056 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9smhv" Mar 18 16:44:38.884305 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:38.884190 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9smhv" podUID="8b451861-a208-430f-840a-bce654bef71f" Mar 18 16:44:38.886728 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:38.886557 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wtbdl" Mar 18 16:44:38.886728 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:38.886676 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wtbdl" podUID="52f2a3f3-56d7-41f7-8bed-9e7229d96408" Mar 18 16:44:39.700698 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:39.700106 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ac3a60cb-dba0-4585-a37d-0402db777ed0-original-pull-secret\") pod \"global-pull-secret-syncer-xkzjh\" (UID: \"ac3a60cb-dba0-4585-a37d-0402db777ed0\") " pod="kube-system/global-pull-secret-syncer-xkzjh" Mar 18 16:44:39.700698 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:39.700268 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Mar 18 16:44:39.700698 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:39.700330 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac3a60cb-dba0-4585-a37d-0402db777ed0-original-pull-secret podName:ac3a60cb-dba0-4585-a37d-0402db777ed0 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:41.700311888 +0000 UTC m=+9.415734882 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ac3a60cb-dba0-4585-a37d-0402db777ed0-original-pull-secret") pod "global-pull-secret-syncer-xkzjh" (UID: "ac3a60cb-dba0-4585-a37d-0402db777ed0") : object "kube-system"/"original-pull-secret" not registered Mar 18 16:44:39.884159 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:39.884113 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xkzjh" Mar 18 16:44:39.884322 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:39.884261 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xkzjh" podUID="ac3a60cb-dba0-4585-a37d-0402db777ed0" Mar 18 16:44:40.884283 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:40.884248 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9smhv" Mar 18 16:44:40.884730 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:40.884385 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9smhv" podUID="8b451861-a208-430f-840a-bce654bef71f" Mar 18 16:44:40.884807 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:40.884248 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wtbdl" Mar 18 16:44:40.884865 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:40.884837 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wtbdl" podUID="52f2a3f3-56d7-41f7-8bed-9e7229d96408" Mar 18 16:44:41.718746 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:41.718707 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ac3a60cb-dba0-4585-a37d-0402db777ed0-original-pull-secret\") pod \"global-pull-secret-syncer-xkzjh\" (UID: \"ac3a60cb-dba0-4585-a37d-0402db777ed0\") " pod="kube-system/global-pull-secret-syncer-xkzjh" Mar 18 16:44:41.718973 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:41.718885 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Mar 18 16:44:41.718973 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:41.718958 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac3a60cb-dba0-4585-a37d-0402db777ed0-original-pull-secret podName:ac3a60cb-dba0-4585-a37d-0402db777ed0 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:45.718928956 +0000 UTC m=+13.434351932 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ac3a60cb-dba0-4585-a37d-0402db777ed0-original-pull-secret") pod "global-pull-secret-syncer-xkzjh" (UID: "ac3a60cb-dba0-4585-a37d-0402db777ed0") : object "kube-system"/"original-pull-secret" not registered Mar 18 16:44:41.883683 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:41.883636 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xkzjh" Mar 18 16:44:41.883879 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:41.883777 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xkzjh" podUID="ac3a60cb-dba0-4585-a37d-0402db777ed0" Mar 18 16:44:42.424440 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:42.424399 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52f2a3f3-56d7-41f7-8bed-9e7229d96408-metrics-certs\") pod \"network-metrics-daemon-wtbdl\" (UID: \"52f2a3f3-56d7-41f7-8bed-9e7229d96408\") " pod="openshift-multus/network-metrics-daemon-wtbdl" Mar 18 16:44:42.424908 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:42.424557 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:42.424908 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:42.424646 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52f2a3f3-56d7-41f7-8bed-9e7229d96408-metrics-certs podName:52f2a3f3-56d7-41f7-8bed-9e7229d96408 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:50.424622807 +0000 UTC m=+18.140045786 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/52f2a3f3-56d7-41f7-8bed-9e7229d96408-metrics-certs") pod "network-metrics-daemon-wtbdl" (UID: "52f2a3f3-56d7-41f7-8bed-9e7229d96408") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:42.626724 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:42.626104 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hbbxq\" (UniqueName: \"kubernetes.io/projected/8b451861-a208-430f-840a-bce654bef71f-kube-api-access-hbbxq\") pod \"network-check-target-9smhv\" (UID: \"8b451861-a208-430f-840a-bce654bef71f\") " pod="openshift-network-diagnostics/network-check-target-9smhv" Mar 18 16:44:42.626724 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:42.626267 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:44:42.626724 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:42.626291 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:44:42.626724 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:42.626304 2573 projected.go:194] Error preparing data for projected volume kube-api-access-hbbxq for pod openshift-network-diagnostics/network-check-target-9smhv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:42.626724 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:42.626361 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8b451861-a208-430f-840a-bce654bef71f-kube-api-access-hbbxq podName:8b451861-a208-430f-840a-bce654bef71f nodeName:}" failed. 
No retries permitted until 2026-03-18 16:44:50.626341711 +0000 UTC m=+18.341764695 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-hbbxq" (UniqueName: "kubernetes.io/projected/8b451861-a208-430f-840a-bce654bef71f-kube-api-access-hbbxq") pod "network-check-target-9smhv" (UID: "8b451861-a208-430f-840a-bce654bef71f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:42.884460 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:42.884376 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9smhv" Mar 18 16:44:42.884606 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:42.884497 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9smhv" podUID="8b451861-a208-430f-840a-bce654bef71f" Mar 18 16:44:42.884926 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:42.884905 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wtbdl" Mar 18 16:44:42.885074 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:42.885041 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wtbdl" podUID="52f2a3f3-56d7-41f7-8bed-9e7229d96408" Mar 18 16:44:43.883837 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:43.883803 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xkzjh" Mar 18 16:44:43.884302 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:43.883954 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xkzjh" podUID="ac3a60cb-dba0-4585-a37d-0402db777ed0" Mar 18 16:44:44.890102 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:44.890068 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wtbdl" Mar 18 16:44:44.890514 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:44.890068 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9smhv" Mar 18 16:44:44.890514 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:44.890207 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wtbdl" podUID="52f2a3f3-56d7-41f7-8bed-9e7229d96408" Mar 18 16:44:44.890514 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:44.890246 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9smhv" podUID="8b451861-a208-430f-840a-bce654bef71f" Mar 18 16:44:45.752192 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:45.752148 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ac3a60cb-dba0-4585-a37d-0402db777ed0-original-pull-secret\") pod \"global-pull-secret-syncer-xkzjh\" (UID: \"ac3a60cb-dba0-4585-a37d-0402db777ed0\") " pod="kube-system/global-pull-secret-syncer-xkzjh" Mar 18 16:44:45.752360 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:45.752304 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Mar 18 16:44:45.752414 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:45.752384 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac3a60cb-dba0-4585-a37d-0402db777ed0-original-pull-secret podName:ac3a60cb-dba0-4585-a37d-0402db777ed0 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:53.752361659 +0000 UTC m=+21.467784640 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ac3a60cb-dba0-4585-a37d-0402db777ed0-original-pull-secret") pod "global-pull-secret-syncer-xkzjh" (UID: "ac3a60cb-dba0-4585-a37d-0402db777ed0") : object "kube-system"/"original-pull-secret" not registered Mar 18 16:44:45.883886 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:45.883853 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xkzjh" Mar 18 16:44:45.884084 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:45.883966 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xkzjh" podUID="ac3a60cb-dba0-4585-a37d-0402db777ed0" Mar 18 16:44:46.884160 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:46.884120 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9smhv" Mar 18 16:44:46.884629 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:46.884130 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wtbdl" Mar 18 16:44:46.884629 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:46.884263 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9smhv" podUID="8b451861-a208-430f-840a-bce654bef71f" Mar 18 16:44:46.884629 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:46.884329 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wtbdl" podUID="52f2a3f3-56d7-41f7-8bed-9e7229d96408" Mar 18 16:44:47.883723 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:47.883685 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xkzjh" Mar 18 16:44:47.883894 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:47.883813 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xkzjh" podUID="ac3a60cb-dba0-4585-a37d-0402db777ed0" Mar 18 16:44:48.886544 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:48.886513 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9smhv" Mar 18 16:44:48.887051 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:48.886577 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wtbdl" Mar 18 16:44:48.887051 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:48.886688 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wtbdl" podUID="52f2a3f3-56d7-41f7-8bed-9e7229d96408" Mar 18 16:44:48.887051 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:48.886772 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9smhv" podUID="8b451861-a208-430f-840a-bce654bef71f" Mar 18 16:44:49.883670 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:49.883629 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xkzjh" Mar 18 16:44:49.883861 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:49.883746 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-xkzjh" podUID="ac3a60cb-dba0-4585-a37d-0402db777ed0" Mar 18 16:44:50.486541 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:50.486499 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52f2a3f3-56d7-41f7-8bed-9e7229d96408-metrics-certs\") pod \"network-metrics-daemon-wtbdl\" (UID: \"52f2a3f3-56d7-41f7-8bed-9e7229d96408\") " pod="openshift-multus/network-metrics-daemon-wtbdl" Mar 18 16:44:50.487017 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:50.486677 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:50.487017 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:50.486765 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52f2a3f3-56d7-41f7-8bed-9e7229d96408-metrics-certs podName:52f2a3f3-56d7-41f7-8bed-9e7229d96408 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:06.4867447 +0000 UTC m=+34.202167691 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/52f2a3f3-56d7-41f7-8bed-9e7229d96408-metrics-certs") pod "network-metrics-daemon-wtbdl" (UID: "52f2a3f3-56d7-41f7-8bed-9e7229d96408") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:50.687670 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:50.687633 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hbbxq\" (UniqueName: \"kubernetes.io/projected/8b451861-a208-430f-840a-bce654bef71f-kube-api-access-hbbxq\") pod \"network-check-target-9smhv\" (UID: \"8b451861-a208-430f-840a-bce654bef71f\") " pod="openshift-network-diagnostics/network-check-target-9smhv" Mar 18 16:44:50.687970 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:50.687777 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:44:50.687970 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:50.687791 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:44:50.687970 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:50.687800 2573 projected.go:194] Error preparing data for projected volume kube-api-access-hbbxq for pod openshift-network-diagnostics/network-check-target-9smhv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:50.687970 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:50.687848 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8b451861-a208-430f-840a-bce654bef71f-kube-api-access-hbbxq podName:8b451861-a208-430f-840a-bce654bef71f nodeName:}" failed. 
No retries permitted until 2026-03-18 16:45:06.687836143 +0000 UTC m=+34.403259119 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-hbbxq" (UniqueName: "kubernetes.io/projected/8b451861-a208-430f-840a-bce654bef71f-kube-api-access-hbbxq") pod "network-check-target-9smhv" (UID: "8b451861-a208-430f-840a-bce654bef71f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:50.884323 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:50.884230 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9smhv" Mar 18 16:44:50.884469 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:50.884373 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9smhv" podUID="8b451861-a208-430f-840a-bce654bef71f" Mar 18 16:44:50.884469 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:50.884412 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wtbdl" Mar 18 16:44:50.884576 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:50.884504 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wtbdl" podUID="52f2a3f3-56d7-41f7-8bed-9e7229d96408" Mar 18 16:44:51.883853 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:51.883813 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xkzjh" Mar 18 16:44:51.884353 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:51.883916 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xkzjh" podUID="ac3a60cb-dba0-4585-a37d-0402db777ed0" Mar 18 16:44:52.884961 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:52.884508 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9smhv" Mar 18 16:44:52.885634 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:52.885056 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9smhv" podUID="8b451861-a208-430f-840a-bce654bef71f" Mar 18 16:44:52.885634 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:52.884577 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wtbdl" Mar 18 16:44:52.885634 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:52.885234 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wtbdl" podUID="52f2a3f3-56d7-41f7-8bed-9e7229d96408" Mar 18 16:44:52.961201 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:52.961160 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8dd92" event={"ID":"ef1bc9f1-f14a-4cb6-8631-d7ab65c971c7","Type":"ContainerStarted","Data":"9f8d8a84d401ce77aa234d38833f1d289027a36449c50acb083d5f8708f55f71"} Mar 18 16:44:52.962892 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:52.962836 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8h4n7" event={"ID":"02c83393-6d45-45b3-bfed-bccf6afbfe33","Type":"ContainerStarted","Data":"c52564e74dcc478eb0272bfa39bc893652ce6ac3e9cae0a3cdd99f52e171030c"} Mar 18 16:44:52.964517 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:52.964487 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fxsfb" event={"ID":"eb98c083-6ae5-4745-a6be-ff841741f1f6","Type":"ContainerStarted","Data":"7daa7ee40978f2a9609c43311e98ee229ffca646d5e4b7312be202bb9a7f915f"} Mar 18 16:44:52.968377 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:52.968342 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qzdnc_c6018dac-2f72-43d9-b554-dffe8cf976c4/ovn-acl-logging/0.log" Mar 18 16:44:52.968700 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:52.968673 2573 generic.go:358] "Generic (PLEG): container finished" 
podID="c6018dac-2f72-43d9-b554-dffe8cf976c4" containerID="af5c08c1c6d2b0b7f1abcf4a4fbda1581a9c11e8638a76ec0d489359bf04b024" exitCode=1 Mar 18 16:44:52.968813 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:52.968741 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" event={"ID":"c6018dac-2f72-43d9-b554-dffe8cf976c4","Type":"ContainerStarted","Data":"ba418373462543f1d8a6a8d840550d778c7eef1eea08c881cfc015a4aef572bd"} Mar 18 16:44:52.968813 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:52.968774 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" event={"ID":"c6018dac-2f72-43d9-b554-dffe8cf976c4","Type":"ContainerStarted","Data":"6cd184b1c1055a3353a48aef77c9edbb603333013e5893b791401b73f6cfc904"} Mar 18 16:44:52.968813 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:52.968790 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" event={"ID":"c6018dac-2f72-43d9-b554-dffe8cf976c4","Type":"ContainerStarted","Data":"4c560866efe1768aa6a9717f26f92699836fbe96e6f16a632e572ce47c108321"} Mar 18 16:44:52.968813 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:52.968804 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" event={"ID":"c6018dac-2f72-43d9-b554-dffe8cf976c4","Type":"ContainerStarted","Data":"fb768fdd838be75fdf63434361e7c7158edcde7f27415555494263debcb5b994"} Mar 18 16:44:52.969056 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:52.968818 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" event={"ID":"c6018dac-2f72-43d9-b554-dffe8cf976c4","Type":"ContainerDied","Data":"af5c08c1c6d2b0b7f1abcf4a4fbda1581a9c11e8638a76ec0d489359bf04b024"} Mar 18 16:44:52.969056 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:52.968833 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" event={"ID":"c6018dac-2f72-43d9-b554-dffe8cf976c4","Type":"ContainerStarted","Data":"f3860d6dc52be8b0a4942902a48416541aa3c18fea819b9df0ae2b607eab3cd4"} Mar 18 16:44:52.970326 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:52.970308 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zdh9l" event={"ID":"6bd0541f-f19e-4cdf-b03e-55eabcf75d7e","Type":"ContainerStarted","Data":"5bb12de2b31738555fa7c81926cc7db6362d5fea89350ccf7ccfac18cf5b30e7"} Mar 18 16:44:52.971879 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:52.971851 2573 generic.go:358] "Generic (PLEG): container finished" podID="7b009b7f-519d-4077-b100-93c7b9934af9" containerID="0f719289071520eb5af3df2145016282aa16f4f266a859f3482ac49665dd7710" exitCode=0 Mar 18 16:44:52.972006 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:52.971928 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4npwk" event={"ID":"7b009b7f-519d-4077-b100-93c7b9934af9","Type":"ContainerDied","Data":"0f719289071520eb5af3df2145016282aa16f4f266a859f3482ac49665dd7710"} Mar 18 16:44:52.973642 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:52.973618 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-wcssc" event={"ID":"705f7aec-1965-4cf4-8f5d-218a3bc94e1a","Type":"ContainerStarted","Data":"5425703ad36b432afd942399c022da793e427385a6b48bd67a502edb27e4c6f7"} Mar 18 16:44:52.975335 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:52.975313 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-jndjx" event={"ID":"2f0c6c9c-5cbc-4d92-9fee-ad30b7cb88a6","Type":"ContainerStarted","Data":"0689e5361cbd3c98fcff561f55a7ba90b0a73710b592b7f94bc83a46fe186b45"} Mar 18 16:44:52.976044 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:52.975996 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-dns/node-resolver-8dd92" podStartSLOduration=3.380360676 podStartE2EDuration="19.975981974s" podCreationTimestamp="2026-03-18 16:44:33 +0000 UTC" firstStartedPulling="2026-03-18 16:44:35.539751068 +0000 UTC m=+3.255174058" lastFinishedPulling="2026-03-18 16:44:52.135372379 +0000 UTC m=+19.850795356" observedRunningTime="2026-03-18 16:44:52.975717187 +0000 UTC m=+20.691140180" watchObservedRunningTime="2026-03-18 16:44:52.975981974 +0000 UTC m=+20.691404970" Mar 18 16:44:52.987355 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:52.987314 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-fxsfb" podStartSLOduration=4.384093142 podStartE2EDuration="20.987301273s" podCreationTimestamp="2026-03-18 16:44:32 +0000 UTC" firstStartedPulling="2026-03-18 16:44:35.532151878 +0000 UTC m=+3.247574868" lastFinishedPulling="2026-03-18 16:44:52.135360012 +0000 UTC m=+19.850782999" observedRunningTime="2026-03-18 16:44:52.986985692 +0000 UTC m=+20.702408688" watchObservedRunningTime="2026-03-18 16:44:52.987301273 +0000 UTC m=+20.702724271" Mar 18 16:44:53.015689 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:53.015642 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-jndjx" podStartSLOduration=3.414629558 podStartE2EDuration="20.015627645s" podCreationTimestamp="2026-03-18 16:44:33 +0000 UTC" firstStartedPulling="2026-03-18 16:44:35.534390145 +0000 UTC m=+3.249813135" lastFinishedPulling="2026-03-18 16:44:52.135388231 +0000 UTC m=+19.850811222" observedRunningTime="2026-03-18 16:44:52.998970976 +0000 UTC m=+20.714393971" watchObservedRunningTime="2026-03-18 16:44:53.015627645 +0000 UTC m=+20.731050642" Mar 18 16:44:53.031030 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:53.030979 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-zdh9l" podStartSLOduration=3.382068254 
podStartE2EDuration="20.030961902s" podCreationTimestamp="2026-03-18 16:44:33 +0000 UTC" firstStartedPulling="2026-03-18 16:44:35.526693118 +0000 UTC m=+3.242116095" lastFinishedPulling="2026-03-18 16:44:52.175586764 +0000 UTC m=+19.891009743" observedRunningTime="2026-03-18 16:44:53.030519561 +0000 UTC m=+20.745942559" watchObservedRunningTime="2026-03-18 16:44:53.030961902 +0000 UTC m=+20.746384897" Mar 18 16:44:53.046290 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:53.046237 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-wcssc" podStartSLOduration=3.447182191 podStartE2EDuration="20.046220802s" podCreationTimestamp="2026-03-18 16:44:33 +0000 UTC" firstStartedPulling="2026-03-18 16:44:35.536364587 +0000 UTC m=+3.251787577" lastFinishedPulling="2026-03-18 16:44:52.135403212 +0000 UTC m=+19.850826188" observedRunningTime="2026-03-18 16:44:53.045862071 +0000 UTC m=+20.761285071" watchObservedRunningTime="2026-03-18 16:44:53.046220802 +0000 UTC m=+20.761643799" Mar 18 16:44:53.766197 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:53.766162 2573 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Mar 18 16:44:53.811909 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:53.811875 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ac3a60cb-dba0-4585-a37d-0402db777ed0-original-pull-secret\") pod \"global-pull-secret-syncer-xkzjh\" (UID: \"ac3a60cb-dba0-4585-a37d-0402db777ed0\") " pod="kube-system/global-pull-secret-syncer-xkzjh" Mar 18 16:44:53.812126 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:53.812032 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Mar 18 16:44:53.812126 ip-10-0-135-99 kubenswrapper[2573]: E0318 
16:44:53.812085 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac3a60cb-dba0-4585-a37d-0402db777ed0-original-pull-secret podName:ac3a60cb-dba0-4585-a37d-0402db777ed0 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:09.812067232 +0000 UTC m=+37.527490209 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ac3a60cb-dba0-4585-a37d-0402db777ed0-original-pull-secret") pod "global-pull-secret-syncer-xkzjh" (UID: "ac3a60cb-dba0-4585-a37d-0402db777ed0") : object "kube-system"/"original-pull-secret" not registered Mar 18 16:44:53.823096 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:53.822920 2573 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-03-18T16:44:53.766187481Z","UUID":"c0e2f918-7b19-4169-92ec-abc7ec81fbcf","Handler":null,"Name":"","Endpoint":""} Mar 18 16:44:53.824853 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:53.824832 2573 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Mar 18 16:44:53.825038 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:53.824860 2573 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Mar 18 16:44:53.883598 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:53.883565 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-xkzjh" Mar 18 16:44:53.883826 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:53.883673 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xkzjh" podUID="ac3a60cb-dba0-4585-a37d-0402db777ed0" Mar 18 16:44:53.979510 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:53.979475 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2rst4" event={"ID":"b2f54d7b-6035-42a3-98a8-3ab8e13ee1ab","Type":"ContainerStarted","Data":"5bd43561e416d008016b24a5c688685817c0835fe2ae35467ef4b0c200aaec01"} Mar 18 16:44:53.981528 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:53.981497 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8h4n7" event={"ID":"02c83393-6d45-45b3-bfed-bccf6afbfe33","Type":"ContainerStarted","Data":"596ce5cc3273330cd6291ff5eb2dc422356a99dec644ceecae30f94ae784bf3c"} Mar 18 16:44:53.994604 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:53.994545 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-2rst4" podStartSLOduration=4.392019031 podStartE2EDuration="20.994528391s" podCreationTimestamp="2026-03-18 16:44:33 +0000 UTC" firstStartedPulling="2026-03-18 16:44:35.533180405 +0000 UTC m=+3.248603390" lastFinishedPulling="2026-03-18 16:44:52.13568976 +0000 UTC m=+19.851112750" observedRunningTime="2026-03-18 16:44:53.994330397 +0000 UTC m=+21.709753397" watchObservedRunningTime="2026-03-18 16:44:53.994528391 +0000 UTC m=+21.709951389" Mar 18 16:44:54.884011 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:54.883969 2573 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9smhv" Mar 18 16:44:54.884211 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:54.884010 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wtbdl" Mar 18 16:44:54.884211 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:54.884106 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9smhv" podUID="8b451861-a208-430f-840a-bce654bef71f" Mar 18 16:44:54.884211 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:54.884189 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wtbdl" podUID="52f2a3f3-56d7-41f7-8bed-9e7229d96408" Mar 18 16:44:55.265421 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:55.265355 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-jndjx" Mar 18 16:44:55.266397 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:55.266370 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-jndjx" Mar 18 16:44:55.883925 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:55.883881 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-xkzjh" Mar 18 16:44:55.884092 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:55.884005 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xkzjh" podUID="ac3a60cb-dba0-4585-a37d-0402db777ed0" Mar 18 16:44:55.988064 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:55.988031 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8h4n7" event={"ID":"02c83393-6d45-45b3-bfed-bccf6afbfe33","Type":"ContainerStarted","Data":"49347d86dfa010386bfd00480b121a402a4650d15585a4eec5048c55ecef4b82"} Mar 18 16:44:55.991362 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:55.991336 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qzdnc_c6018dac-2f72-43d9-b554-dffe8cf976c4/ovn-acl-logging/0.log" Mar 18 16:44:55.991912 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:55.991882 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" event={"ID":"c6018dac-2f72-43d9-b554-dffe8cf976c4","Type":"ContainerStarted","Data":"8dfb465568e53caf91ff1c37a11ea4de70a9a5e5803bfd4bc77eecc996c8516a"} Mar 18 16:44:56.009579 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:56.009529 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8h4n7" podStartSLOduration=4.572845766 podStartE2EDuration="24.009509028s" podCreationTimestamp="2026-03-18 16:44:32 +0000 UTC" firstStartedPulling="2026-03-18 16:44:35.538626356 +0000 UTC m=+3.254049349" lastFinishedPulling="2026-03-18 16:44:54.97528962 +0000 UTC m=+22.690712611" 
observedRunningTime="2026-03-18 16:44:56.009106119 +0000 UTC m=+23.724529129" watchObservedRunningTime="2026-03-18 16:44:56.009509028 +0000 UTC m=+23.724932027" Mar 18 16:44:56.884384 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:56.884195 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9smhv" Mar 18 16:44:56.884810 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:56.884195 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wtbdl" Mar 18 16:44:56.884810 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:56.884473 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9smhv" podUID="8b451861-a208-430f-840a-bce654bef71f" Mar 18 16:44:56.884810 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:56.884551 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wtbdl" podUID="52f2a3f3-56d7-41f7-8bed-9e7229d96408" Mar 18 16:44:57.883741 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:57.883510 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-xkzjh" Mar 18 16:44:57.883873 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:57.883763 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xkzjh" podUID="ac3a60cb-dba0-4585-a37d-0402db777ed0" Mar 18 16:44:57.998590 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:57.998564 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qzdnc_c6018dac-2f72-43d9-b554-dffe8cf976c4/ovn-acl-logging/0.log" Mar 18 16:44:57.999443 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:57.998933 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" event={"ID":"c6018dac-2f72-43d9-b554-dffe8cf976c4","Type":"ContainerStarted","Data":"0f3ee0523fd3c07e198ee4ffe53dc08bf5d16bb4703cc5c7c2870866b3899799"} Mar 18 16:44:57.999443 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:57.999391 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" Mar 18 16:44:57.999443 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:57.999420 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" Mar 18 16:44:57.999578 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:57.999555 2573 scope.go:117] "RemoveContainer" containerID="af5c08c1c6d2b0b7f1abcf4a4fbda1581a9c11e8638a76ec0d489359bf04b024" Mar 18 16:44:58.000630 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:58.000607 2573 generic.go:358] "Generic (PLEG): container finished" podID="7b009b7f-519d-4077-b100-93c7b9934af9" 
containerID="5592746a087b266ca5d668c1c2229ecb6583664244aedef7b6d51b5d736cb1cf" exitCode=0 Mar 18 16:44:58.000741 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:58.000647 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4npwk" event={"ID":"7b009b7f-519d-4077-b100-93c7b9934af9","Type":"ContainerDied","Data":"5592746a087b266ca5d668c1c2229ecb6583664244aedef7b6d51b5d736cb1cf"} Mar 18 16:44:58.017403 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:58.017379 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" Mar 18 16:44:58.325869 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:58.325823 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-jndjx" Mar 18 16:44:58.326077 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:58.326004 2573 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 16:44:58.326518 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:58.326495 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-jndjx" Mar 18 16:44:58.883973 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:58.883919 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wtbdl" Mar 18 16:44:58.884173 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:58.883976 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9smhv" Mar 18 16:44:58.884173 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:58.884096 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wtbdl" podUID="52f2a3f3-56d7-41f7-8bed-9e7229d96408" Mar 18 16:44:58.884296 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:58.884271 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9smhv" podUID="8b451861-a208-430f-840a-bce654bef71f" Mar 18 16:44:59.006070 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:59.006048 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qzdnc_c6018dac-2f72-43d9-b554-dffe8cf976c4/ovn-acl-logging/0.log" Mar 18 16:44:59.006479 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:59.006440 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" event={"ID":"c6018dac-2f72-43d9-b554-dffe8cf976c4","Type":"ContainerStarted","Data":"ef553e0ec3fe2fe1d623a7bb0ed1a2995aaa31093098301fd00534d45a981c9a"} Mar 18 16:44:59.006818 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:59.006797 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" Mar 18 16:44:59.021015 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:59.020993 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" Mar 18 16:44:59.040662 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:59.040615 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc" podStartSLOduration=9.195074838 podStartE2EDuration="26.0405969s" podCreationTimestamp="2026-03-18 16:44:33 +0000 UTC" firstStartedPulling="2026-03-18 16:44:35.529870715 
+0000 UTC m=+3.245293690" lastFinishedPulling="2026-03-18 16:44:52.375392771 +0000 UTC m=+20.090815752" observedRunningTime="2026-03-18 16:44:59.039291462 +0000 UTC m=+26.754714463" watchObservedRunningTime="2026-03-18 16:44:59.0405969 +0000 UTC m=+26.756019898" Mar 18 16:44:59.327730 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:59.327694 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-xkzjh"] Mar 18 16:44:59.327907 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:59.327825 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xkzjh" Mar 18 16:44:59.327987 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:59.327912 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xkzjh" podUID="ac3a60cb-dba0-4585-a37d-0402db777ed0" Mar 18 16:44:59.330725 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:59.330683 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-9smhv"] Mar 18 16:44:59.330837 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:59.330806 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9smhv" Mar 18 16:44:59.330953 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:59.330918 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9smhv" podUID="8b451861-a208-430f-840a-bce654bef71f" Mar 18 16:44:59.331300 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:59.331278 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wtbdl"] Mar 18 16:44:59.331389 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:44:59.331368 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wtbdl" Mar 18 16:44:59.331458 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:44:59.331443 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wtbdl" podUID="52f2a3f3-56d7-41f7-8bed-9e7229d96408" Mar 18 16:45:00.010756 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:00.010725 2573 generic.go:358] "Generic (PLEG): container finished" podID="7b009b7f-519d-4077-b100-93c7b9934af9" containerID="80a61fcc415036847884bb483e58230afce1d2f5d02d0ba47b20e92a1e19895c" exitCode=0 Mar 18 16:45:00.011141 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:00.010802 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4npwk" event={"ID":"7b009b7f-519d-4077-b100-93c7b9934af9","Type":"ContainerDied","Data":"80a61fcc415036847884bb483e58230afce1d2f5d02d0ba47b20e92a1e19895c"} Mar 18 16:45:00.884204 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:00.884165 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9smhv" Mar 18 16:45:00.884499 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:00.884273 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9smhv" podUID="8b451861-a208-430f-840a-bce654bef71f" Mar 18 16:45:00.884499 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:00.884284 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xkzjh" Mar 18 16:45:00.884499 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:00.884302 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wtbdl" Mar 18 16:45:00.884499 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:00.884407 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wtbdl" podUID="52f2a3f3-56d7-41f7-8bed-9e7229d96408" Mar 18 16:45:00.884499 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:00.884474 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-xkzjh" podUID="ac3a60cb-dba0-4585-a37d-0402db777ed0" Mar 18 16:45:02.020346 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:02.020308 2573 generic.go:358] "Generic (PLEG): container finished" podID="7b009b7f-519d-4077-b100-93c7b9934af9" containerID="38311c7d4f3329ddb6d19a4302b6a46dcb00f95c93e01507845d555a4088fa30" exitCode=0 Mar 18 16:45:02.020816 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:02.020369 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4npwk" event={"ID":"7b009b7f-519d-4077-b100-93c7b9934af9","Type":"ContainerDied","Data":"38311c7d4f3329ddb6d19a4302b6a46dcb00f95c93e01507845d555a4088fa30"} Mar 18 16:45:02.884757 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:02.884713 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9smhv" Mar 18 16:45:02.884953 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:02.884800 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xkzjh" Mar 18 16:45:02.884953 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:02.884837 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wtbdl" Mar 18 16:45:02.885078 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:02.885043 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wtbdl" podUID="52f2a3f3-56d7-41f7-8bed-9e7229d96408" Mar 18 16:45:02.885212 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:02.885102 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9smhv" podUID="8b451861-a208-430f-840a-bce654bef71f" Mar 18 16:45:02.885212 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:02.885153 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xkzjh" podUID="ac3a60cb-dba0-4585-a37d-0402db777ed0" Mar 18 16:45:04.883424 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:04.883241 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xkzjh" Mar 18 16:45:04.883424 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:04.883249 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wtbdl" Mar 18 16:45:04.883424 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:04.883379 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-xkzjh" podUID="ac3a60cb-dba0-4585-a37d-0402db777ed0" Mar 18 16:45:04.884047 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:04.883464 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wtbdl" podUID="52f2a3f3-56d7-41f7-8bed-9e7229d96408" Mar 18 16:45:04.884047 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:04.883523 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9smhv" Mar 18 16:45:04.884047 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:04.883584 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9smhv" podUID="8b451861-a208-430f-840a-bce654bef71f" Mar 18 16:45:05.087980 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.087751 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-99.ec2.internal" event="NodeReady" Mar 18 16:45:05.088146 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.088097 2573 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Mar 18 16:45:05.126270 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.126236 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7776559f4-s949p"] Mar 18 16:45:05.151524 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.151440 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7776559f4-s949p"] Mar 18 16:45:05.151524 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.151472 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-7zqf5"] Mar 18 16:45:05.151780 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.151648 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7776559f4-s949p" Mar 18 16:45:05.154189 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.154156 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Mar 18 16:45:05.154623 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.154577 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Mar 18 16:45:05.154623 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.154621 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Mar 18 16:45:05.154859 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.154640 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-8k48n\"" Mar 18 16:45:05.159904 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.159818 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Mar 18 16:45:05.166178 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.166153 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-bxl54"] Mar 18 16:45:05.166321 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.166305 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7zqf5" Mar 18 16:45:05.169552 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.169365 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Mar 18 16:45:05.169552 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.169406 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-dz9s5\"" Mar 18 16:45:05.169552 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.169434 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Mar 18 16:45:05.169807 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.169493 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Mar 18 16:45:05.190513 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.190480 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7zqf5"] Mar 18 16:45:05.190513 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.190518 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bxl54"] Mar 18 16:45:05.190737 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.190655 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-bxl54" Mar 18 16:45:05.194460 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.194231 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Mar 18 16:45:05.194460 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.194323 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-wnp77\"" Mar 18 16:45:05.194460 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.194334 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Mar 18 16:45:05.300758 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.300728 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/52134bb8-4757-4487-a5b9-38bdaea56506-registry-tls\") pod \"image-registry-7776559f4-s949p\" (UID: \"52134bb8-4757-4487-a5b9-38bdaea56506\") " pod="openshift-image-registry/image-registry-7776559f4-s949p" Mar 18 16:45:05.300973 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.300781 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3-cert\") pod \"ingress-canary-7zqf5\" (UID: \"1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3\") " pod="openshift-ingress-canary/ingress-canary-7zqf5" Mar 18 16:45:05.300973 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.300807 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/52134bb8-4757-4487-a5b9-38bdaea56506-image-registry-private-configuration\") pod \"image-registry-7776559f4-s949p\" (UID: \"52134bb8-4757-4487-a5b9-38bdaea56506\") " pod="openshift-image-registry/image-registry-7776559f4-s949p" 
Mar 18 16:45:05.300973 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.300914 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/52134bb8-4757-4487-a5b9-38bdaea56506-trusted-ca\") pod \"image-registry-7776559f4-s949p\" (UID: \"52134bb8-4757-4487-a5b9-38bdaea56506\") " pod="openshift-image-registry/image-registry-7776559f4-s949p" Mar 18 16:45:05.300973 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.300968 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/52134bb8-4757-4487-a5b9-38bdaea56506-registry-certificates\") pod \"image-registry-7776559f4-s949p\" (UID: \"52134bb8-4757-4487-a5b9-38bdaea56506\") " pod="openshift-image-registry/image-registry-7776559f4-s949p" Mar 18 16:45:05.301138 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.301016 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/52134bb8-4757-4487-a5b9-38bdaea56506-bound-sa-token\") pod \"image-registry-7776559f4-s949p\" (UID: \"52134bb8-4757-4487-a5b9-38bdaea56506\") " pod="openshift-image-registry/image-registry-7776559f4-s949p" Mar 18 16:45:05.301138 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.301044 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f10649a-4fba-40f6-9e10-02d8301f5e9e-config-volume\") pod \"dns-default-bxl54\" (UID: \"5f10649a-4fba-40f6-9e10-02d8301f5e9e\") " pod="openshift-dns/dns-default-bxl54" Mar 18 16:45:05.301138 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.301070 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/52134bb8-4757-4487-a5b9-38bdaea56506-installation-pull-secrets\") pod \"image-registry-7776559f4-s949p\" (UID: \"52134bb8-4757-4487-a5b9-38bdaea56506\") " pod="openshift-image-registry/image-registry-7776559f4-s949p" Mar 18 16:45:05.301138 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.301107 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f10649a-4fba-40f6-9e10-02d8301f5e9e-metrics-tls\") pod \"dns-default-bxl54\" (UID: \"5f10649a-4fba-40f6-9e10-02d8301f5e9e\") " pod="openshift-dns/dns-default-bxl54" Mar 18 16:45:05.301321 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.301143 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt9ww\" (UniqueName: \"kubernetes.io/projected/5f10649a-4fba-40f6-9e10-02d8301f5e9e-kube-api-access-wt9ww\") pod \"dns-default-bxl54\" (UID: \"5f10649a-4fba-40f6-9e10-02d8301f5e9e\") " pod="openshift-dns/dns-default-bxl54" Mar 18 16:45:05.301321 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.301192 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcqkb\" (UniqueName: \"kubernetes.io/projected/1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3-kube-api-access-gcqkb\") pod \"ingress-canary-7zqf5\" (UID: \"1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3\") " pod="openshift-ingress-canary/ingress-canary-7zqf5" Mar 18 16:45:05.301321 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.301222 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/52134bb8-4757-4487-a5b9-38bdaea56506-ca-trust-extracted\") pod \"image-registry-7776559f4-s949p\" (UID: \"52134bb8-4757-4487-a5b9-38bdaea56506\") " pod="openshift-image-registry/image-registry-7776559f4-s949p" Mar 18 16:45:05.301321 ip-10-0-135-99 
kubenswrapper[2573]: I0318 16:45:05.301242 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhxzr\" (UniqueName: \"kubernetes.io/projected/52134bb8-4757-4487-a5b9-38bdaea56506-kube-api-access-lhxzr\") pod \"image-registry-7776559f4-s949p\" (UID: \"52134bb8-4757-4487-a5b9-38bdaea56506\") " pod="openshift-image-registry/image-registry-7776559f4-s949p" Mar 18 16:45:05.301321 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.301265 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5f10649a-4fba-40f6-9e10-02d8301f5e9e-tmp-dir\") pod \"dns-default-bxl54\" (UID: \"5f10649a-4fba-40f6-9e10-02d8301f5e9e\") " pod="openshift-dns/dns-default-bxl54" Mar 18 16:45:05.402245 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.402152 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/52134bb8-4757-4487-a5b9-38bdaea56506-registry-tls\") pod \"image-registry-7776559f4-s949p\" (UID: \"52134bb8-4757-4487-a5b9-38bdaea56506\") " pod="openshift-image-registry/image-registry-7776559f4-s949p" Mar 18 16:45:05.402245 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.402227 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3-cert\") pod \"ingress-canary-7zqf5\" (UID: \"1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3\") " pod="openshift-ingress-canary/ingress-canary-7zqf5" Mar 18 16:45:05.402477 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:05.402331 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:45:05.402477 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:05.402335 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: 
secret "image-registry-tls" not found Mar 18 16:45:05.402477 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:05.402360 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7776559f4-s949p: secret "image-registry-tls" not found Mar 18 16:45:05.402477 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.402367 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/52134bb8-4757-4487-a5b9-38bdaea56506-image-registry-private-configuration\") pod \"image-registry-7776559f4-s949p\" (UID: \"52134bb8-4757-4487-a5b9-38bdaea56506\") " pod="openshift-image-registry/image-registry-7776559f4-s949p" Mar 18 16:45:05.402477 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:05.402407 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3-cert podName:1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:05.902387671 +0000 UTC m=+33.617810728 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3-cert") pod "ingress-canary-7zqf5" (UID: "1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3") : secret "canary-serving-cert" not found Mar 18 16:45:05.402714 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:05.402491 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/52134bb8-4757-4487-a5b9-38bdaea56506-registry-tls podName:52134bb8-4757-4487-a5b9-38bdaea56506 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:05.902471406 +0000 UTC m=+33.617894397 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/52134bb8-4757-4487-a5b9-38bdaea56506-registry-tls") pod "image-registry-7776559f4-s949p" (UID: "52134bb8-4757-4487-a5b9-38bdaea56506") : secret "image-registry-tls" not found Mar 18 16:45:05.402714 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.402543 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/52134bb8-4757-4487-a5b9-38bdaea56506-trusted-ca\") pod \"image-registry-7776559f4-s949p\" (UID: \"52134bb8-4757-4487-a5b9-38bdaea56506\") " pod="openshift-image-registry/image-registry-7776559f4-s949p" Mar 18 16:45:05.402714 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.402581 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/52134bb8-4757-4487-a5b9-38bdaea56506-registry-certificates\") pod \"image-registry-7776559f4-s949p\" (UID: \"52134bb8-4757-4487-a5b9-38bdaea56506\") " pod="openshift-image-registry/image-registry-7776559f4-s949p" Mar 18 16:45:05.402714 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.402611 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/52134bb8-4757-4487-a5b9-38bdaea56506-bound-sa-token\") pod \"image-registry-7776559f4-s949p\" (UID: \"52134bb8-4757-4487-a5b9-38bdaea56506\") " pod="openshift-image-registry/image-registry-7776559f4-s949p" Mar 18 16:45:05.402714 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.402644 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f10649a-4fba-40f6-9e10-02d8301f5e9e-config-volume\") pod \"dns-default-bxl54\" (UID: \"5f10649a-4fba-40f6-9e10-02d8301f5e9e\") " pod="openshift-dns/dns-default-bxl54" Mar 18 16:45:05.402922 ip-10-0-135-99 
kubenswrapper[2573]: I0318 16:45:05.402767 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/52134bb8-4757-4487-a5b9-38bdaea56506-installation-pull-secrets\") pod \"image-registry-7776559f4-s949p\" (UID: \"52134bb8-4757-4487-a5b9-38bdaea56506\") " pod="openshift-image-registry/image-registry-7776559f4-s949p" Mar 18 16:45:05.402922 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.402825 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f10649a-4fba-40f6-9e10-02d8301f5e9e-metrics-tls\") pod \"dns-default-bxl54\" (UID: \"5f10649a-4fba-40f6-9e10-02d8301f5e9e\") " pod="openshift-dns/dns-default-bxl54" Mar 18 16:45:05.402922 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.402856 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wt9ww\" (UniqueName: \"kubernetes.io/projected/5f10649a-4fba-40f6-9e10-02d8301f5e9e-kube-api-access-wt9ww\") pod \"dns-default-bxl54\" (UID: \"5f10649a-4fba-40f6-9e10-02d8301f5e9e\") " pod="openshift-dns/dns-default-bxl54" Mar 18 16:45:05.402922 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.402907 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gcqkb\" (UniqueName: \"kubernetes.io/projected/1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3-kube-api-access-gcqkb\") pod \"ingress-canary-7zqf5\" (UID: \"1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3\") " pod="openshift-ingress-canary/ingress-canary-7zqf5" Mar 18 16:45:05.403170 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.402958 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/52134bb8-4757-4487-a5b9-38bdaea56506-ca-trust-extracted\") pod \"image-registry-7776559f4-s949p\" (UID: \"52134bb8-4757-4487-a5b9-38bdaea56506\") " 
pod="openshift-image-registry/image-registry-7776559f4-s949p" Mar 18 16:45:05.403170 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.402993 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lhxzr\" (UniqueName: \"kubernetes.io/projected/52134bb8-4757-4487-a5b9-38bdaea56506-kube-api-access-lhxzr\") pod \"image-registry-7776559f4-s949p\" (UID: \"52134bb8-4757-4487-a5b9-38bdaea56506\") " pod="openshift-image-registry/image-registry-7776559f4-s949p" Mar 18 16:45:05.403170 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.403025 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5f10649a-4fba-40f6-9e10-02d8301f5e9e-tmp-dir\") pod \"dns-default-bxl54\" (UID: \"5f10649a-4fba-40f6-9e10-02d8301f5e9e\") " pod="openshift-dns/dns-default-bxl54" Mar 18 16:45:05.403321 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.403307 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f10649a-4fba-40f6-9e10-02d8301f5e9e-config-volume\") pod \"dns-default-bxl54\" (UID: \"5f10649a-4fba-40f6-9e10-02d8301f5e9e\") " pod="openshift-dns/dns-default-bxl54" Mar 18 16:45:05.403378 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.403317 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5f10649a-4fba-40f6-9e10-02d8301f5e9e-tmp-dir\") pod \"dns-default-bxl54\" (UID: \"5f10649a-4fba-40f6-9e10-02d8301f5e9e\") " pod="openshift-dns/dns-default-bxl54" Mar 18 16:45:05.403378 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:05.403313 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:45:05.403473 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:05.403397 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5f10649a-4fba-40f6-9e10-02d8301f5e9e-metrics-tls podName:5f10649a-4fba-40f6-9e10-02d8301f5e9e nodeName:}" failed. No retries permitted until 2026-03-18 16:45:05.903374851 +0000 UTC m=+33.618798173 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5f10649a-4fba-40f6-9e10-02d8301f5e9e-metrics-tls") pod "dns-default-bxl54" (UID: "5f10649a-4fba-40f6-9e10-02d8301f5e9e") : secret "dns-default-metrics-tls" not found Mar 18 16:45:05.403537 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.403500 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/52134bb8-4757-4487-a5b9-38bdaea56506-registry-certificates\") pod \"image-registry-7776559f4-s949p\" (UID: \"52134bb8-4757-4487-a5b9-38bdaea56506\") " pod="openshift-image-registry/image-registry-7776559f4-s949p" Mar 18 16:45:05.403735 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.403677 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/52134bb8-4757-4487-a5b9-38bdaea56506-ca-trust-extracted\") pod \"image-registry-7776559f4-s949p\" (UID: \"52134bb8-4757-4487-a5b9-38bdaea56506\") " pod="openshift-image-registry/image-registry-7776559f4-s949p" Mar 18 16:45:05.404193 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.404170 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/52134bb8-4757-4487-a5b9-38bdaea56506-trusted-ca\") pod \"image-registry-7776559f4-s949p\" (UID: \"52134bb8-4757-4487-a5b9-38bdaea56506\") " pod="openshift-image-registry/image-registry-7776559f4-s949p" Mar 18 16:45:05.407436 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.407414 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/52134bb8-4757-4487-a5b9-38bdaea56506-installation-pull-secrets\") pod \"image-registry-7776559f4-s949p\" (UID: \"52134bb8-4757-4487-a5b9-38bdaea56506\") " pod="openshift-image-registry/image-registry-7776559f4-s949p" Mar 18 16:45:05.407542 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.407414 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/52134bb8-4757-4487-a5b9-38bdaea56506-image-registry-private-configuration\") pod \"image-registry-7776559f4-s949p\" (UID: \"52134bb8-4757-4487-a5b9-38bdaea56506\") " pod="openshift-image-registry/image-registry-7776559f4-s949p" Mar 18 16:45:05.412107 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.412085 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/52134bb8-4757-4487-a5b9-38bdaea56506-bound-sa-token\") pod \"image-registry-7776559f4-s949p\" (UID: \"52134bb8-4757-4487-a5b9-38bdaea56506\") " pod="openshift-image-registry/image-registry-7776559f4-s949p" Mar 18 16:45:05.414053 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.414008 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcqkb\" (UniqueName: \"kubernetes.io/projected/1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3-kube-api-access-gcqkb\") pod \"ingress-canary-7zqf5\" (UID: \"1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3\") " pod="openshift-ingress-canary/ingress-canary-7zqf5" Mar 18 16:45:05.415619 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.415592 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt9ww\" (UniqueName: \"kubernetes.io/projected/5f10649a-4fba-40f6-9e10-02d8301f5e9e-kube-api-access-wt9ww\") pod \"dns-default-bxl54\" (UID: \"5f10649a-4fba-40f6-9e10-02d8301f5e9e\") " pod="openshift-dns/dns-default-bxl54" Mar 18 16:45:05.415853 ip-10-0-135-99 kubenswrapper[2573]: I0318 
16:45:05.415837 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhxzr\" (UniqueName: \"kubernetes.io/projected/52134bb8-4757-4487-a5b9-38bdaea56506-kube-api-access-lhxzr\") pod \"image-registry-7776559f4-s949p\" (UID: \"52134bb8-4757-4487-a5b9-38bdaea56506\") " pod="openshift-image-registry/image-registry-7776559f4-s949p" Mar 18 16:45:05.907846 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.907803 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f10649a-4fba-40f6-9e10-02d8301f5e9e-metrics-tls\") pod \"dns-default-bxl54\" (UID: \"5f10649a-4fba-40f6-9e10-02d8301f5e9e\") " pod="openshift-dns/dns-default-bxl54" Mar 18 16:45:05.907846 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.907855 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/52134bb8-4757-4487-a5b9-38bdaea56506-registry-tls\") pod \"image-registry-7776559f4-s949p\" (UID: \"52134bb8-4757-4487-a5b9-38bdaea56506\") " pod="openshift-image-registry/image-registry-7776559f4-s949p" Mar 18 16:45:05.908318 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:05.907885 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3-cert\") pod \"ingress-canary-7zqf5\" (UID: \"1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3\") " pod="openshift-ingress-canary/ingress-canary-7zqf5" Mar 18 16:45:05.908318 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:05.907959 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:45:05.908318 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:05.907984 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:45:05.908318 ip-10-0-135-99 
kubenswrapper[2573]: E0318 16:45:05.908037 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3-cert podName:1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:06.908016196 +0000 UTC m=+34.623439173 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3-cert") pod "ingress-canary-7zqf5" (UID: "1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3") : secret "canary-serving-cert" not found Mar 18 16:45:05.908318 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:05.908038 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Mar 18 16:45:05.908318 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:05.908056 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7776559f4-s949p: secret "image-registry-tls" not found Mar 18 16:45:05.908318 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:05.908061 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f10649a-4fba-40f6-9e10-02d8301f5e9e-metrics-tls podName:5f10649a-4fba-40f6-9e10-02d8301f5e9e nodeName:}" failed. No retries permitted until 2026-03-18 16:45:06.908052056 +0000 UTC m=+34.623475042 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5f10649a-4fba-40f6-9e10-02d8301f5e9e-metrics-tls") pod "dns-default-bxl54" (UID: "5f10649a-4fba-40f6-9e10-02d8301f5e9e") : secret "dns-default-metrics-tls" not found Mar 18 16:45:05.908318 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:05.908115 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/52134bb8-4757-4487-a5b9-38bdaea56506-registry-tls podName:52134bb8-4757-4487-a5b9-38bdaea56506 nodeName:}" failed. 
No retries permitted until 2026-03-18 16:45:06.908095321 +0000 UTC m=+34.623518305 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/52134bb8-4757-4487-a5b9-38bdaea56506-registry-tls") pod "image-registry-7776559f4-s949p" (UID: "52134bb8-4757-4487-a5b9-38bdaea56506") : secret "image-registry-tls" not found Mar 18 16:45:06.512824 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:06.512780 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52f2a3f3-56d7-41f7-8bed-9e7229d96408-metrics-certs\") pod \"network-metrics-daemon-wtbdl\" (UID: \"52f2a3f3-56d7-41f7-8bed-9e7229d96408\") " pod="openshift-multus/network-metrics-daemon-wtbdl" Mar 18 16:45:06.513046 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:06.512900 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:45:06.513046 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:06.512990 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52f2a3f3-56d7-41f7-8bed-9e7229d96408-metrics-certs podName:52f2a3f3-56d7-41f7-8bed-9e7229d96408 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:38.512972993 +0000 UTC m=+66.228395975 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/52f2a3f3-56d7-41f7-8bed-9e7229d96408-metrics-certs") pod "network-metrics-daemon-wtbdl" (UID: "52f2a3f3-56d7-41f7-8bed-9e7229d96408") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:45:06.715329 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:06.715288 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hbbxq\" (UniqueName: \"kubernetes.io/projected/8b451861-a208-430f-840a-bce654bef71f-kube-api-access-hbbxq\") pod \"network-check-target-9smhv\" (UID: \"8b451861-a208-430f-840a-bce654bef71f\") " pod="openshift-network-diagnostics/network-check-target-9smhv" Mar 18 16:45:06.715538 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:06.715482 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:45:06.715538 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:06.715512 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:45:06.715538 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:06.715526 2573 projected.go:194] Error preparing data for projected volume kube-api-access-hbbxq for pod openshift-network-diagnostics/network-check-target-9smhv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:45:06.715696 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:06.715594 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8b451861-a208-430f-840a-bce654bef71f-kube-api-access-hbbxq podName:8b451861-a208-430f-840a-bce654bef71f nodeName:}" failed. 
No retries permitted until 2026-03-18 16:45:38.715573538 +0000 UTC m=+66.430996522 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-hbbxq" (UniqueName: "kubernetes.io/projected/8b451861-a208-430f-840a-bce654bef71f-kube-api-access-hbbxq") pod "network-check-target-9smhv" (UID: "8b451861-a208-430f-840a-bce654bef71f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:45:06.887059 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:06.886982 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wtbdl" Mar 18 16:45:06.887059 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:06.887023 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xkzjh" Mar 18 16:45:06.887265 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:06.886982 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9smhv" Mar 18 16:45:06.890353 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:06.890328 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Mar 18 16:45:06.891691 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:06.891600 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-w8h72\"" Mar 18 16:45:06.891691 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:06.891646 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-t48rq\"" Mar 18 16:45:06.891691 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:06.891667 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Mar 18 16:45:06.891691 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:06.891688 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Mar 18 16:45:06.892006 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:06.891667 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Mar 18 16:45:06.917217 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:06.917180 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f10649a-4fba-40f6-9e10-02d8301f5e9e-metrics-tls\") pod \"dns-default-bxl54\" (UID: \"5f10649a-4fba-40f6-9e10-02d8301f5e9e\") " pod="openshift-dns/dns-default-bxl54" Mar 18 16:45:06.917695 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:06.917263 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/52134bb8-4757-4487-a5b9-38bdaea56506-registry-tls\") pod \"image-registry-7776559f4-s949p\" (UID: \"52134bb8-4757-4487-a5b9-38bdaea56506\") " pod="openshift-image-registry/image-registry-7776559f4-s949p" Mar 18 16:45:06.917695 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:06.917308 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3-cert\") pod \"ingress-canary-7zqf5\" (UID: \"1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3\") " pod="openshift-ingress-canary/ingress-canary-7zqf5" Mar 18 16:45:06.917695 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:06.917351 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:45:06.917695 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:06.917381 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Mar 18 16:45:06.917695 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:06.917399 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7776559f4-s949p: secret "image-registry-tls" not found Mar 18 16:45:06.917695 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:06.917428 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f10649a-4fba-40f6-9e10-02d8301f5e9e-metrics-tls podName:5f10649a-4fba-40f6-9e10-02d8301f5e9e nodeName:}" failed. No retries permitted until 2026-03-18 16:45:08.917406915 +0000 UTC m=+36.632829906 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5f10649a-4fba-40f6-9e10-02d8301f5e9e-metrics-tls") pod "dns-default-bxl54" (UID: "5f10649a-4fba-40f6-9e10-02d8301f5e9e") : secret "dns-default-metrics-tls" not found Mar 18 16:45:06.917695 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:06.917435 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:45:06.917695 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:06.917454 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/52134bb8-4757-4487-a5b9-38bdaea56506-registry-tls podName:52134bb8-4757-4487-a5b9-38bdaea56506 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:08.917437688 +0000 UTC m=+36.632860672 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/52134bb8-4757-4487-a5b9-38bdaea56506-registry-tls") pod "image-registry-7776559f4-s949p" (UID: "52134bb8-4757-4487-a5b9-38bdaea56506") : secret "image-registry-tls" not found Mar 18 16:45:06.917695 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:06.917483 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3-cert podName:1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:08.917474337 +0000 UTC m=+36.632897313 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3-cert") pod "ingress-canary-7zqf5" (UID: "1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3") : secret "canary-serving-cert" not found Mar 18 16:45:08.935281 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:08.935242 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/52134bb8-4757-4487-a5b9-38bdaea56506-registry-tls\") pod \"image-registry-7776559f4-s949p\" (UID: \"52134bb8-4757-4487-a5b9-38bdaea56506\") " pod="openshift-image-registry/image-registry-7776559f4-s949p" Mar 18 16:45:08.936011 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:08.935297 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3-cert\") pod \"ingress-canary-7zqf5\" (UID: \"1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3\") " pod="openshift-ingress-canary/ingress-canary-7zqf5" Mar 18 16:45:08.936011 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:08.935349 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f10649a-4fba-40f6-9e10-02d8301f5e9e-metrics-tls\") pod \"dns-default-bxl54\" (UID: \"5f10649a-4fba-40f6-9e10-02d8301f5e9e\") " pod="openshift-dns/dns-default-bxl54" Mar 18 16:45:08.936011 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:08.935404 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Mar 18 16:45:08.936011 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:08.935427 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7776559f4-s949p: secret "image-registry-tls" not found Mar 18 16:45:08.936011 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:08.935449 2573 secret.go:189] 
Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:45:08.936011 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:08.935448 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:45:08.936011 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:08.935488 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/52134bb8-4757-4487-a5b9-38bdaea56506-registry-tls podName:52134bb8-4757-4487-a5b9-38bdaea56506 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:12.935469123 +0000 UTC m=+40.650892118 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/52134bb8-4757-4487-a5b9-38bdaea56506-registry-tls") pod "image-registry-7776559f4-s949p" (UID: "52134bb8-4757-4487-a5b9-38bdaea56506") : secret "image-registry-tls" not found Mar 18 16:45:08.936011 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:08.935505 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3-cert podName:1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:12.935493888 +0000 UTC m=+40.650916865 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3-cert") pod "ingress-canary-7zqf5" (UID: "1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3") : secret "canary-serving-cert" not found Mar 18 16:45:08.936011 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:08.935518 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f10649a-4fba-40f6-9e10-02d8301f5e9e-metrics-tls podName:5f10649a-4fba-40f6-9e10-02d8301f5e9e nodeName:}" failed. No retries permitted until 2026-03-18 16:45:12.935511366 +0000 UTC m=+40.650934341 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5f10649a-4fba-40f6-9e10-02d8301f5e9e-metrics-tls") pod "dns-default-bxl54" (UID: "5f10649a-4fba-40f6-9e10-02d8301f5e9e") : secret "dns-default-metrics-tls" not found Mar 18 16:45:09.036721 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:09.036686 2573 generic.go:358] "Generic (PLEG): container finished" podID="7b009b7f-519d-4077-b100-93c7b9934af9" containerID="86dd3e1e8c73168a6217e77e80192d732dedb33b88d23b62a9a6904826032d65" exitCode=0 Mar 18 16:45:09.036860 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:09.036738 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4npwk" event={"ID":"7b009b7f-519d-4077-b100-93c7b9934af9","Type":"ContainerDied","Data":"86dd3e1e8c73168a6217e77e80192d732dedb33b88d23b62a9a6904826032d65"} Mar 18 16:45:09.842697 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:09.842650 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ac3a60cb-dba0-4585-a37d-0402db777ed0-original-pull-secret\") pod \"global-pull-secret-syncer-xkzjh\" (UID: \"ac3a60cb-dba0-4585-a37d-0402db777ed0\") " pod="kube-system/global-pull-secret-syncer-xkzjh" Mar 18 16:45:09.845196 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:09.845173 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ac3a60cb-dba0-4585-a37d-0402db777ed0-original-pull-secret\") pod \"global-pull-secret-syncer-xkzjh\" (UID: \"ac3a60cb-dba0-4585-a37d-0402db777ed0\") " pod="kube-system/global-pull-secret-syncer-xkzjh" Mar 18 16:45:09.912207 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:09.912168 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-xkzjh" Mar 18 16:45:10.041783 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:10.041518 2573 generic.go:358] "Generic (PLEG): container finished" podID="7b009b7f-519d-4077-b100-93c7b9934af9" containerID="1410a015be382f0d04b1d67c7e5edf6d360f5dbc4369fdb29e51a953bed480bd" exitCode=0 Mar 18 16:45:10.042295 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:10.041817 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4npwk" event={"ID":"7b009b7f-519d-4077-b100-93c7b9934af9","Type":"ContainerDied","Data":"1410a015be382f0d04b1d67c7e5edf6d360f5dbc4369fdb29e51a953bed480bd"} Mar 18 16:45:10.053909 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:10.053876 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-xkzjh"] Mar 18 16:45:10.057899 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:45:10.057873 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac3a60cb_dba0_4585_a37d_0402db777ed0.slice/crio-7f1e3aed917bf0fce17c49930b92ef9889f977f7601a8cfca9e04e57ffab998c WatchSource:0}: Error finding container 7f1e3aed917bf0fce17c49930b92ef9889f977f7601a8cfca9e04e57ffab998c: Status 404 returned error can't find the container with id 7f1e3aed917bf0fce17c49930b92ef9889f977f7601a8cfca9e04e57ffab998c Mar 18 16:45:11.048320 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:11.048282 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4npwk" event={"ID":"7b009b7f-519d-4077-b100-93c7b9934af9","Type":"ContainerStarted","Data":"d6949b507e1cd452264a22472fee8eddf4a1b2d4104e42257e298a1dc9c1d544"} Mar 18 16:45:11.049590 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:11.049564 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-xkzjh" 
event={"ID":"ac3a60cb-dba0-4585-a37d-0402db777ed0","Type":"ContainerStarted","Data":"7f1e3aed917bf0fce17c49930b92ef9889f977f7601a8cfca9e04e57ffab998c"} Mar 18 16:45:11.073391 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:11.073327 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-4npwk" podStartSLOduration=5.343385782 podStartE2EDuration="38.073305938s" podCreationTimestamp="2026-03-18 16:44:33 +0000 UTC" firstStartedPulling="2026-03-18 16:44:35.524861934 +0000 UTC m=+3.240284952" lastFinishedPulling="2026-03-18 16:45:08.254782114 +0000 UTC m=+35.970205108" observedRunningTime="2026-03-18 16:45:11.073199699 +0000 UTC m=+38.788622698" watchObservedRunningTime="2026-03-18 16:45:11.073305938 +0000 UTC m=+38.788728937" Mar 18 16:45:12.968895 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:12.968835 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f10649a-4fba-40f6-9e10-02d8301f5e9e-metrics-tls\") pod \"dns-default-bxl54\" (UID: \"5f10649a-4fba-40f6-9e10-02d8301f5e9e\") " pod="openshift-dns/dns-default-bxl54" Mar 18 16:45:12.969403 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:12.968956 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/52134bb8-4757-4487-a5b9-38bdaea56506-registry-tls\") pod \"image-registry-7776559f4-s949p\" (UID: \"52134bb8-4757-4487-a5b9-38bdaea56506\") " pod="openshift-image-registry/image-registry-7776559f4-s949p" Mar 18 16:45:12.969403 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:12.968996 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3-cert\") pod \"ingress-canary-7zqf5\" (UID: \"1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3\") " pod="openshift-ingress-canary/ingress-canary-7zqf5" Mar 18 
16:45:12.969403 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:12.969026 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:45:12.969403 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:12.969126 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f10649a-4fba-40f6-9e10-02d8301f5e9e-metrics-tls podName:5f10649a-4fba-40f6-9e10-02d8301f5e9e nodeName:}" failed. No retries permitted until 2026-03-18 16:45:20.969102271 +0000 UTC m=+48.684525259 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5f10649a-4fba-40f6-9e10-02d8301f5e9e-metrics-tls") pod "dns-default-bxl54" (UID: "5f10649a-4fba-40f6-9e10-02d8301f5e9e") : secret "dns-default-metrics-tls" not found Mar 18 16:45:12.969403 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:12.969135 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:45:12.969403 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:12.969153 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Mar 18 16:45:12.969403 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:12.969175 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7776559f4-s949p: secret "image-registry-tls" not found Mar 18 16:45:12.969403 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:12.969217 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3-cert podName:1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:20.969199245 +0000 UTC m=+48.684622227 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3-cert") pod "ingress-canary-7zqf5" (UID: "1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3") : secret "canary-serving-cert" not found
Mar 18 16:45:12.969403 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:12.969237 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/52134bb8-4757-4487-a5b9-38bdaea56506-registry-tls podName:52134bb8-4757-4487-a5b9-38bdaea56506 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:20.969226554 +0000 UTC m=+48.684649544 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/52134bb8-4757-4487-a5b9-38bdaea56506-registry-tls") pod "image-registry-7776559f4-s949p" (UID: "52134bb8-4757-4487-a5b9-38bdaea56506") : secret "image-registry-tls" not found
Mar 18 16:45:14.056868 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:14.056762 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-xkzjh" event={"ID":"ac3a60cb-dba0-4585-a37d-0402db777ed0","Type":"ContainerStarted","Data":"f1dcf37f61ab5e60b7bf7a49f2021abd32930e4a0fdf41575cc15b7a6dd6ef97"}
Mar 18 16:45:14.073476 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:14.073424 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-xkzjh" podStartSLOduration=32.336135008 podStartE2EDuration="36.073408492s" podCreationTimestamp="2026-03-18 16:44:38 +0000 UTC" firstStartedPulling="2026-03-18 16:45:10.059991623 +0000 UTC m=+37.775414598" lastFinishedPulling="2026-03-18 16:45:13.797265098 +0000 UTC m=+41.512688082" observedRunningTime="2026-03-18 16:45:14.072401013 +0000 UTC m=+41.787824014" watchObservedRunningTime="2026-03-18 16:45:14.073408492 +0000 UTC m=+41.788831489"
Mar 18 16:45:21.029693 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:21.029646 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f10649a-4fba-40f6-9e10-02d8301f5e9e-metrics-tls\") pod \"dns-default-bxl54\" (UID: \"5f10649a-4fba-40f6-9e10-02d8301f5e9e\") " pod="openshift-dns/dns-default-bxl54"
Mar 18 16:45:21.030269 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:21.029708 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/52134bb8-4757-4487-a5b9-38bdaea56506-registry-tls\") pod \"image-registry-7776559f4-s949p\" (UID: \"52134bb8-4757-4487-a5b9-38bdaea56506\") " pod="openshift-image-registry/image-registry-7776559f4-s949p"
Mar 18 16:45:21.030269 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:21.029730 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3-cert\") pod \"ingress-canary-7zqf5\" (UID: \"1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3\") " pod="openshift-ingress-canary/ingress-canary-7zqf5"
Mar 18 16:45:21.030269 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:21.029831 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 18 16:45:21.030269 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:21.029839 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 18 16:45:21.030269 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:21.029897 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3-cert podName:1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:37.029883952 +0000 UTC m=+64.745306928 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3-cert") pod "ingress-canary-7zqf5" (UID: "1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3") : secret "canary-serving-cert" not found
Mar 18 16:45:21.030269 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:21.029932 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f10649a-4fba-40f6-9e10-02d8301f5e9e-metrics-tls podName:5f10649a-4fba-40f6-9e10-02d8301f5e9e nodeName:}" failed. No retries permitted until 2026-03-18 16:45:37.029912944 +0000 UTC m=+64.745335927 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5f10649a-4fba-40f6-9e10-02d8301f5e9e-metrics-tls") pod "dns-default-bxl54" (UID: "5f10649a-4fba-40f6-9e10-02d8301f5e9e") : secret "dns-default-metrics-tls" not found
Mar 18 16:45:21.030269 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:21.029846 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Mar 18 16:45:21.030269 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:21.029972 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7776559f4-s949p: secret "image-registry-tls" not found
Mar 18 16:45:21.030269 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:21.030010 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/52134bb8-4757-4487-a5b9-38bdaea56506-registry-tls podName:52134bb8-4757-4487-a5b9-38bdaea56506 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:37.029999079 +0000 UTC m=+64.745422056 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/52134bb8-4757-4487-a5b9-38bdaea56506-registry-tls") pod "image-registry-7776559f4-s949p" (UID: "52134bb8-4757-4487-a5b9-38bdaea56506") : secret "image-registry-tls" not found
Mar 18 16:45:31.029524 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:31.029493 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qzdnc"
Mar 18 16:45:37.044313 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:37.044258 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/52134bb8-4757-4487-a5b9-38bdaea56506-registry-tls\") pod \"image-registry-7776559f4-s949p\" (UID: \"52134bb8-4757-4487-a5b9-38bdaea56506\") " pod="openshift-image-registry/image-registry-7776559f4-s949p"
Mar 18 16:45:37.044313 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:37.044319 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3-cert\") pod \"ingress-canary-7zqf5\" (UID: \"1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3\") " pod="openshift-ingress-canary/ingress-canary-7zqf5"
Mar 18 16:45:37.044824 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:37.044395 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f10649a-4fba-40f6-9e10-02d8301f5e9e-metrics-tls\") pod \"dns-default-bxl54\" (UID: \"5f10649a-4fba-40f6-9e10-02d8301f5e9e\") " pod="openshift-dns/dns-default-bxl54"
Mar 18 16:45:37.044824 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:37.044438 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Mar 18 16:45:37.044824 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:37.044460 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 18 16:45:37.044824 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:37.044466 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7776559f4-s949p: secret "image-registry-tls" not found
Mar 18 16:45:37.044824 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:37.044528 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/52134bb8-4757-4487-a5b9-38bdaea56506-registry-tls podName:52134bb8-4757-4487-a5b9-38bdaea56506 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:09.044510804 +0000 UTC m=+96.759933779 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/52134bb8-4757-4487-a5b9-38bdaea56506-registry-tls") pod "image-registry-7776559f4-s949p" (UID: "52134bb8-4757-4487-a5b9-38bdaea56506") : secret "image-registry-tls" not found
Mar 18 16:45:37.044824 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:37.044536 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 18 16:45:37.044824 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:37.044543 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3-cert podName:1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:09.044537743 +0000 UTC m=+96.759960719 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3-cert") pod "ingress-canary-7zqf5" (UID: "1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3") : secret "canary-serving-cert" not found
Mar 18 16:45:37.044824 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:37.044606 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f10649a-4fba-40f6-9e10-02d8301f5e9e-metrics-tls podName:5f10649a-4fba-40f6-9e10-02d8301f5e9e nodeName:}" failed. No retries permitted until 2026-03-18 16:46:09.044588406 +0000 UTC m=+96.760011389 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5f10649a-4fba-40f6-9e10-02d8301f5e9e-metrics-tls") pod "dns-default-bxl54" (UID: "5f10649a-4fba-40f6-9e10-02d8301f5e9e") : secret "dns-default-metrics-tls" not found
Mar 18 16:45:38.556999 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:38.556958 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52f2a3f3-56d7-41f7-8bed-9e7229d96408-metrics-certs\") pod \"network-metrics-daemon-wtbdl\" (UID: \"52f2a3f3-56d7-41f7-8bed-9e7229d96408\") " pod="openshift-multus/network-metrics-daemon-wtbdl"
Mar 18 16:45:38.560029 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:38.560010 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Mar 18 16:45:38.567546 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:38.567518 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Mar 18 16:45:38.567615 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:45:38.567590 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52f2a3f3-56d7-41f7-8bed-9e7229d96408-metrics-certs podName:52f2a3f3-56d7-41f7-8bed-9e7229d96408 nodeName:}" failed.
No retries permitted until 2026-03-18 16:46:42.567569344 +0000 UTC m=+130.282992320 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/52f2a3f3-56d7-41f7-8bed-9e7229d96408-metrics-certs") pod "network-metrics-daemon-wtbdl" (UID: "52f2a3f3-56d7-41f7-8bed-9e7229d96408") : secret "metrics-daemon-secret" not found
Mar 18 16:45:38.759151 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:38.759105 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hbbxq\" (UniqueName: \"kubernetes.io/projected/8b451861-a208-430f-840a-bce654bef71f-kube-api-access-hbbxq\") pod \"network-check-target-9smhv\" (UID: \"8b451861-a208-430f-840a-bce654bef71f\") " pod="openshift-network-diagnostics/network-check-target-9smhv"
Mar 18 16:45:38.762095 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:38.762073 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Mar 18 16:45:38.772473 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:38.772450 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Mar 18 16:45:38.784276 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:38.784253 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbbxq\" (UniqueName: \"kubernetes.io/projected/8b451861-a208-430f-840a-bce654bef71f-kube-api-access-hbbxq\") pod \"network-check-target-9smhv\" (UID: \"8b451861-a208-430f-840a-bce654bef71f\") " pod="openshift-network-diagnostics/network-check-target-9smhv"
Mar 18 16:45:39.010385 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:39.010358 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-t48rq\""
Mar 18 16:45:39.017882 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:39.017839 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9smhv"
Mar 18 16:45:39.164813 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:39.164780 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-9smhv"]
Mar 18 16:45:39.167780 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:45:39.167753 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b451861_a208_430f_840a_bce654bef71f.slice/crio-c75737968a04682894eb4cb14c58c4c8b4b625ad49a58635c5f6c902bf3b7bca WatchSource:0}: Error finding container c75737968a04682894eb4cb14c58c4c8b4b625ad49a58635c5f6c902bf3b7bca: Status 404 returned error can't find the container with id c75737968a04682894eb4cb14c58c4c8b4b625ad49a58635c5f6c902bf3b7bca
Mar 18 16:45:40.109986 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:40.109933 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-9smhv" event={"ID":"8b451861-a208-430f-840a-bce654bef71f","Type":"ContainerStarted","Data":"c75737968a04682894eb4cb14c58c4c8b4b625ad49a58635c5f6c902bf3b7bca"}
Mar 18 16:45:42.115402 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:42.115304 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-9smhv" event={"ID":"8b451861-a208-430f-840a-bce654bef71f","Type":"ContainerStarted","Data":"7ad84c8b433c1b904e80716cac7dd6bdad94523df75df02c6a9b84eb63a0f183"}
Mar 18 16:45:42.115775 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:42.115416 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-9smhv"
Mar 18 16:45:42.131554 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:45:42.131491 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-9smhv" podStartSLOduration=66.475352731 podStartE2EDuration="1m9.131475541s" podCreationTimestamp="2026-03-18 16:44:33 +0000 UTC" firstStartedPulling="2026-03-18 16:45:39.169709937 +0000 UTC m=+66.885132926" lastFinishedPulling="2026-03-18 16:45:41.825832742 +0000 UTC m=+69.541255736" observedRunningTime="2026-03-18 16:45:42.130816139 +0000 UTC m=+69.846239136" watchObservedRunningTime="2026-03-18 16:45:42.131475541 +0000 UTC m=+69.846898558"
Mar 18 16:46:09.080720 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:09.080670 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f10649a-4fba-40f6-9e10-02d8301f5e9e-metrics-tls\") pod \"dns-default-bxl54\" (UID: \"5f10649a-4fba-40f6-9e10-02d8301f5e9e\") " pod="openshift-dns/dns-default-bxl54"
Mar 18 16:46:09.081213 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:09.080737 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/52134bb8-4757-4487-a5b9-38bdaea56506-registry-tls\") pod \"image-registry-7776559f4-s949p\" (UID: \"52134bb8-4757-4487-a5b9-38bdaea56506\") " pod="openshift-image-registry/image-registry-7776559f4-s949p"
Mar 18 16:46:09.081213 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:09.080767 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3-cert\") pod \"ingress-canary-7zqf5\" (UID: \"1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3\") " pod="openshift-ingress-canary/ingress-canary-7zqf5"
Mar 18 16:46:09.081213 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:46:09.080862 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 18 16:46:09.081213 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:46:09.080874 2573 projected.go:264]
Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Mar 18 16:46:09.081213 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:46:09.080898 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7776559f4-s949p: secret "image-registry-tls" not found
Mar 18 16:46:09.081213 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:46:09.080922 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3-cert podName:1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3 nodeName:}" failed. No retries permitted until 2026-03-18 16:47:13.080899391 +0000 UTC m=+160.796322367 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3-cert") pod "ingress-canary-7zqf5" (UID: "1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3") : secret "canary-serving-cert" not found
Mar 18 16:46:09.081213 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:46:09.080877 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 18 16:46:09.081213 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:46:09.080982 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/52134bb8-4757-4487-a5b9-38bdaea56506-registry-tls podName:52134bb8-4757-4487-a5b9-38bdaea56506 nodeName:}" failed. No retries permitted until 2026-03-18 16:47:13.080966825 +0000 UTC m=+160.796389817 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/52134bb8-4757-4487-a5b9-38bdaea56506-registry-tls") pod "image-registry-7776559f4-s949p" (UID: "52134bb8-4757-4487-a5b9-38bdaea56506") : secret "image-registry-tls" not found
Mar 18 16:46:09.081213 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:46:09.081031 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f10649a-4fba-40f6-9e10-02d8301f5e9e-metrics-tls podName:5f10649a-4fba-40f6-9e10-02d8301f5e9e nodeName:}" failed. No retries permitted until 2026-03-18 16:47:13.081012519 +0000 UTC m=+160.796435506 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5f10649a-4fba-40f6-9e10-02d8301f5e9e-metrics-tls") pod "dns-default-bxl54" (UID: "5f10649a-4fba-40f6-9e10-02d8301f5e9e") : secret "dns-default-metrics-tls" not found
Mar 18 16:46:13.120445 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:13.120414 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-9smhv"
Mar 18 16:46:42.625021 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:42.624965 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52f2a3f3-56d7-41f7-8bed-9e7229d96408-metrics-certs\") pod \"network-metrics-daemon-wtbdl\" (UID: \"52f2a3f3-56d7-41f7-8bed-9e7229d96408\") " pod="openshift-multus/network-metrics-daemon-wtbdl"
Mar 18 16:46:42.625451 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:46:42.625118 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Mar 18 16:46:42.625451 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:46:42.625196 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52f2a3f3-56d7-41f7-8bed-9e7229d96408-metrics-certs podName:52f2a3f3-56d7-41f7-8bed-9e7229d96408 nodeName:}" failed. No retries permitted until 2026-03-18 16:48:44.625180967 +0000 UTC m=+252.340603943 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/52f2a3f3-56d7-41f7-8bed-9e7229d96408-metrics-certs") pod "network-metrics-daemon-wtbdl" (UID: "52f2a3f3-56d7-41f7-8bed-9e7229d96408") : secret "metrics-daemon-secret" not found
Mar 18 16:46:49.449572 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:49.449528 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-l4zpq"]
Mar 18 16:46:49.452733 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:49.452709 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-l4zpq"
Mar 18 16:46:49.453255 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:49.453231 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-xp98c"]
Mar 18 16:46:49.455091 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:49.455067 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Mar 18 16:46:49.455242 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:49.455073 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-n8dng\""
Mar 18 16:46:49.456205 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:49.456185 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-xp98c"
Mar 18 16:46:49.456205 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:49.456199 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Mar 18 16:46:49.456351 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:49.456217 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Mar 18 16:46:49.456351 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:49.456222 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Mar 18 16:46:49.458732 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:49.458717 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-9jhwp\""
Mar 18 16:46:49.459044 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:49.459026 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Mar 18 16:46:49.459208 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:49.459033 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Mar 18 16:46:49.459337 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:49.459323 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Mar 18 16:46:49.459408 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:49.459356 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Mar 18 16:46:49.463650 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:49.463632 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-l4zpq"]
Mar 18 16:46:49.467349 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:49.467330 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-xp98c"]
Mar 18 16:46:49.472256 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:49.472231 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/847a4bfe-1ce5-4bcc-bd8c-f62cb630b993-config\") pod \"service-ca-operator-56f6f4cbcb-xp98c\" (UID: \"847a4bfe-1ce5-4bcc-bd8c-f62cb630b993\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-xp98c"
Mar 18 16:46:49.472356 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:49.472299 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5757757-50ec-49cc-93fb-20785bb506cb-config\") pod \"kube-storage-version-migrator-operator-866f46547-l4zpq\" (UID: \"b5757757-50ec-49cc-93fb-20785bb506cb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-l4zpq"
Mar 18 16:46:49.472356 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:49.472331 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjw68\" (UniqueName: \"kubernetes.io/projected/b5757757-50ec-49cc-93fb-20785bb506cb-kube-api-access-sjw68\") pod \"kube-storage-version-migrator-operator-866f46547-l4zpq\" (UID: \"b5757757-50ec-49cc-93fb-20785bb506cb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-l4zpq"
Mar 18 16:46:49.472437 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:49.472376 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwkmf\" (UniqueName: \"kubernetes.io/projected/847a4bfe-1ce5-4bcc-bd8c-f62cb630b993-kube-api-access-vwkmf\") pod \"service-ca-operator-56f6f4cbcb-xp98c\" (UID: \"847a4bfe-1ce5-4bcc-bd8c-f62cb630b993\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-xp98c"
Mar 18 16:46:49.472437 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:49.472420 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5757757-50ec-49cc-93fb-20785bb506cb-serving-cert\") pod \"kube-storage-version-migrator-operator-866f46547-l4zpq\" (UID: \"b5757757-50ec-49cc-93fb-20785bb506cb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-l4zpq"
Mar 18 16:46:49.472496 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:49.472449 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/847a4bfe-1ce5-4bcc-bd8c-f62cb630b993-serving-cert\") pod \"service-ca-operator-56f6f4cbcb-xp98c\" (UID: \"847a4bfe-1ce5-4bcc-bd8c-f62cb630b993\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-xp98c"
Mar 18 16:46:49.573292 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:49.573246 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/847a4bfe-1ce5-4bcc-bd8c-f62cb630b993-config\") pod \"service-ca-operator-56f6f4cbcb-xp98c\" (UID: \"847a4bfe-1ce5-4bcc-bd8c-f62cb630b993\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-xp98c"
Mar 18 16:46:49.573475 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:49.573340 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName:
\"kubernetes.io/configmap/b5757757-50ec-49cc-93fb-20785bb506cb-config\") pod \"kube-storage-version-migrator-operator-866f46547-l4zpq\" (UID: \"b5757757-50ec-49cc-93fb-20785bb506cb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-l4zpq"
Mar 18 16:46:49.573475 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:49.573371 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sjw68\" (UniqueName: \"kubernetes.io/projected/b5757757-50ec-49cc-93fb-20785bb506cb-kube-api-access-sjw68\") pod \"kube-storage-version-migrator-operator-866f46547-l4zpq\" (UID: \"b5757757-50ec-49cc-93fb-20785bb506cb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-l4zpq"
Mar 18 16:46:49.573475 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:49.573396 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vwkmf\" (UniqueName: \"kubernetes.io/projected/847a4bfe-1ce5-4bcc-bd8c-f62cb630b993-kube-api-access-vwkmf\") pod \"service-ca-operator-56f6f4cbcb-xp98c\" (UID: \"847a4bfe-1ce5-4bcc-bd8c-f62cb630b993\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-xp98c"
Mar 18 16:46:49.573475 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:49.573425 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5757757-50ec-49cc-93fb-20785bb506cb-serving-cert\") pod \"kube-storage-version-migrator-operator-866f46547-l4zpq\" (UID: \"b5757757-50ec-49cc-93fb-20785bb506cb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-l4zpq"
Mar 18 16:46:49.573653 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:49.573546 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/847a4bfe-1ce5-4bcc-bd8c-f62cb630b993-serving-cert\") pod \"service-ca-operator-56f6f4cbcb-xp98c\" (UID: \"847a4bfe-1ce5-4bcc-bd8c-f62cb630b993\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-xp98c"
Mar 18 16:46:49.573969 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:49.573921 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5757757-50ec-49cc-93fb-20785bb506cb-config\") pod \"kube-storage-version-migrator-operator-866f46547-l4zpq\" (UID: \"b5757757-50ec-49cc-93fb-20785bb506cb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-l4zpq"
Mar 18 16:46:49.573969 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:49.573961 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/847a4bfe-1ce5-4bcc-bd8c-f62cb630b993-config\") pod \"service-ca-operator-56f6f4cbcb-xp98c\" (UID: \"847a4bfe-1ce5-4bcc-bd8c-f62cb630b993\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-xp98c"
Mar 18 16:46:49.575743 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:49.575725 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/847a4bfe-1ce5-4bcc-bd8c-f62cb630b993-serving-cert\") pod \"service-ca-operator-56f6f4cbcb-xp98c\" (UID: \"847a4bfe-1ce5-4bcc-bd8c-f62cb630b993\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-xp98c"
Mar 18 16:46:49.575743 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:49.575732 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5757757-50ec-49cc-93fb-20785bb506cb-serving-cert\") pod \"kube-storage-version-migrator-operator-866f46547-l4zpq\" (UID: \"b5757757-50ec-49cc-93fb-20785bb506cb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-l4zpq"
Mar 18 16:46:49.582893 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:49.582872 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwkmf\" (UniqueName: \"kubernetes.io/projected/847a4bfe-1ce5-4bcc-bd8c-f62cb630b993-kube-api-access-vwkmf\") pod \"service-ca-operator-56f6f4cbcb-xp98c\" (UID: \"847a4bfe-1ce5-4bcc-bd8c-f62cb630b993\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-xp98c"
Mar 18 16:46:49.583015 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:49.582928 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjw68\" (UniqueName: \"kubernetes.io/projected/b5757757-50ec-49cc-93fb-20785bb506cb-kube-api-access-sjw68\") pod \"kube-storage-version-migrator-operator-866f46547-l4zpq\" (UID: \"b5757757-50ec-49cc-93fb-20785bb506cb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-l4zpq"
Mar 18 16:46:49.764473 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:49.764360 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-l4zpq"
Mar 18 16:46:49.769200 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:49.769167 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-xp98c"
Mar 18 16:46:49.895846 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:49.895816 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-l4zpq"]
Mar 18 16:46:49.899809 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:46:49.899781 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5757757_50ec_49cc_93fb_20785bb506cb.slice/crio-d95f84bda6bad17fde4f2a25404d518d303de1eb60eecf3998b71fc25047b880 WatchSource:0}: Error finding container d95f84bda6bad17fde4f2a25404d518d303de1eb60eecf3998b71fc25047b880: Status 404 returned error can't find the container with id d95f84bda6bad17fde4f2a25404d518d303de1eb60eecf3998b71fc25047b880
Mar 18 16:46:49.909958 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:49.909915 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-xp98c"]
Mar 18 16:46:49.913072 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:46:49.913045 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod847a4bfe_1ce5_4bcc_bd8c_f62cb630b993.slice/crio-2965aa719f903f09030247a91c02211241bca21a5ded1389f8a9db1f4cf540c6 WatchSource:0}: Error finding container 2965aa719f903f09030247a91c02211241bca21a5ded1389f8a9db1f4cf540c6: Status 404 returned error can't find the container with id 2965aa719f903f09030247a91c02211241bca21a5ded1389f8a9db1f4cf540c6
Mar 18 16:46:50.245182 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:50.245142 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-xp98c" event={"ID":"847a4bfe-1ce5-4bcc-bd8c-f62cb630b993","Type":"ContainerStarted","Data":"2965aa719f903f09030247a91c02211241bca21a5ded1389f8a9db1f4cf540c6"}
Mar 18 16:46:50.246015 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:50.245993 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-l4zpq" event={"ID":"b5757757-50ec-49cc-93fb-20785bb506cb","Type":"ContainerStarted","Data":"d95f84bda6bad17fde4f2a25404d518d303de1eb60eecf3998b71fc25047b880"}
Mar 18 16:46:52.252646 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:52.252607 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-xp98c" event={"ID":"847a4bfe-1ce5-4bcc-bd8c-f62cb630b993","Type":"ContainerStarted","Data":"e66d20c2d00d2536a7379774712fcdeb3681efbc043c7e1ba3a50ef3c014195f"}
Mar 18 16:46:52.253974 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:52.253935 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-l4zpq" event={"ID":"b5757757-50ec-49cc-93fb-20785bb506cb","Type":"ContainerStarted","Data":"08cd55cb452215c2d45af4ddfbb32764f16b631a40f01f332c8df13253f2ee9f"}
Mar 18 16:46:52.270217 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:52.270174 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-xp98c" podStartSLOduration=1.128056745 podStartE2EDuration="3.270158669s" podCreationTimestamp="2026-03-18 16:46:49 +0000 UTC" firstStartedPulling="2026-03-18 16:46:49.915081816 +0000 UTC m=+137.630504793" lastFinishedPulling="2026-03-18 16:46:52.057183729 +0000 UTC m=+139.772606717" observedRunningTime="2026-03-18 16:46:52.269556901 +0000 UTC m=+139.984979900" watchObservedRunningTime="2026-03-18 16:46:52.270158669 +0000 UTC m=+139.985581667"
Mar 18 16:46:52.286560 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:52.286511 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-l4zpq" podStartSLOduration=1.133202436 podStartE2EDuration="3.286496803s" podCreationTimestamp="2026-03-18 16:46:49 +0000 UTC" firstStartedPulling="2026-03-18 16:46:49.901566023 +0000 UTC m=+137.616989000" lastFinishedPulling="2026-03-18 16:46:52.054860376 +0000 UTC m=+139.770283367" observedRunningTime="2026-03-18 16:46:52.285481287 +0000 UTC m=+140.000904286" watchObservedRunningTime="2026-03-18 16:46:52.286496803 +0000 UTC m=+140.001919800"
Mar 18 16:46:55.050634 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:55.050605 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-8dd92_ef1bc9f1-f14a-4cb6-8631-d7ab65c971c7/dns-node-resolver/0.log"
Mar 18 16:46:55.872010 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:55.871977 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-8bb587b94-pglpg"]
Mar 18 16:46:55.875035 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:55.875016 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-8bb587b94-pglpg"
Mar 18 16:46:55.877871 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:55.877845 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Mar 18 16:46:55.878010 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:55.877901 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Mar 18 16:46:55.878010 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:55.877845 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Mar 18 16:46:55.879039 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:55.879016 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Mar 18 16:46:55.879039 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:55.879017 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-hx2js\""
Mar 18 16:46:55.882769 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:55.882746 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-8bb587b94-pglpg"]
Mar 18 16:46:55.921900 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:55.921861 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwrjk\" (UniqueName: \"kubernetes.io/projected/725a2751-b38a-4590-9b8c-8218d00b7301-kube-api-access-bwrjk\") pod \"service-ca-8bb587b94-pglpg\" (UID: \"725a2751-b38a-4590-9b8c-8218d00b7301\") " pod="openshift-service-ca/service-ca-8bb587b94-pglpg"
Mar 18 16:46:55.922109 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:55.921983 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName:
\"kubernetes.io/configmap/725a2751-b38a-4590-9b8c-8218d00b7301-signing-cabundle\") pod \"service-ca-8bb587b94-pglpg\" (UID: \"725a2751-b38a-4590-9b8c-8218d00b7301\") " pod="openshift-service-ca/service-ca-8bb587b94-pglpg" Mar 18 16:46:55.922109 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:55.922016 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/725a2751-b38a-4590-9b8c-8218d00b7301-signing-key\") pod \"service-ca-8bb587b94-pglpg\" (UID: \"725a2751-b38a-4590-9b8c-8218d00b7301\") " pod="openshift-service-ca/service-ca-8bb587b94-pglpg" Mar 18 16:46:56.022918 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:56.022883 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/725a2751-b38a-4590-9b8c-8218d00b7301-signing-cabundle\") pod \"service-ca-8bb587b94-pglpg\" (UID: \"725a2751-b38a-4590-9b8c-8218d00b7301\") " pod="openshift-service-ca/service-ca-8bb587b94-pglpg" Mar 18 16:46:56.023115 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:56.022925 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/725a2751-b38a-4590-9b8c-8218d00b7301-signing-key\") pod \"service-ca-8bb587b94-pglpg\" (UID: \"725a2751-b38a-4590-9b8c-8218d00b7301\") " pod="openshift-service-ca/service-ca-8bb587b94-pglpg" Mar 18 16:46:56.023115 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:56.022982 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bwrjk\" (UniqueName: \"kubernetes.io/projected/725a2751-b38a-4590-9b8c-8218d00b7301-kube-api-access-bwrjk\") pod \"service-ca-8bb587b94-pglpg\" (UID: \"725a2751-b38a-4590-9b8c-8218d00b7301\") " pod="openshift-service-ca/service-ca-8bb587b94-pglpg" Mar 18 16:46:56.023571 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:56.023539 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/725a2751-b38a-4590-9b8c-8218d00b7301-signing-cabundle\") pod \"service-ca-8bb587b94-pglpg\" (UID: \"725a2751-b38a-4590-9b8c-8218d00b7301\") " pod="openshift-service-ca/service-ca-8bb587b94-pglpg" Mar 18 16:46:56.025352 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:56.025330 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/725a2751-b38a-4590-9b8c-8218d00b7301-signing-key\") pod \"service-ca-8bb587b94-pglpg\" (UID: \"725a2751-b38a-4590-9b8c-8218d00b7301\") " pod="openshift-service-ca/service-ca-8bb587b94-pglpg" Mar 18 16:46:56.032015 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:56.031985 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwrjk\" (UniqueName: \"kubernetes.io/projected/725a2751-b38a-4590-9b8c-8218d00b7301-kube-api-access-bwrjk\") pod \"service-ca-8bb587b94-pglpg\" (UID: \"725a2751-b38a-4590-9b8c-8218d00b7301\") " pod="openshift-service-ca/service-ca-8bb587b94-pglpg" Mar 18 16:46:56.050759 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:56.050738 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fxsfb_eb98c083-6ae5-4745-a6be-ff841741f1f6/node-ca/0.log" Mar 18 16:46:56.185094 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:56.185063 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-8bb587b94-pglpg" Mar 18 16:46:56.304313 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:56.304281 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-8bb587b94-pglpg"] Mar 18 16:46:56.307620 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:46:56.307591 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod725a2751_b38a_4590_9b8c_8218d00b7301.slice/crio-89b37b9830bbbe76a1bec5786ef38aeb3a85128712bdf016df127140f21ef02d WatchSource:0}: Error finding container 89b37b9830bbbe76a1bec5786ef38aeb3a85128712bdf016df127140f21ef02d: Status 404 returned error can't find the container with id 89b37b9830bbbe76a1bec5786ef38aeb3a85128712bdf016df127140f21ef02d Mar 18 16:46:57.265514 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:57.265477 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-8bb587b94-pglpg" event={"ID":"725a2751-b38a-4590-9b8c-8218d00b7301","Type":"ContainerStarted","Data":"97ef42d6ef1bbaace0daa54d03e0c3c0e5f387e692eac99213096fa2a15e54e4"} Mar 18 16:46:57.265514 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:57.265518 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-8bb587b94-pglpg" event={"ID":"725a2751-b38a-4590-9b8c-8218d00b7301","Type":"ContainerStarted","Data":"89b37b9830bbbe76a1bec5786ef38aeb3a85128712bdf016df127140f21ef02d"} Mar 18 16:46:57.284395 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:46:57.284346 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-8bb587b94-pglpg" podStartSLOduration=2.2843302100000002 podStartE2EDuration="2.28433021s" podCreationTimestamp="2026-03-18 16:46:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:46:57.283527232 
+0000 UTC m=+144.998950232" watchObservedRunningTime="2026-03-18 16:46:57.28433021 +0000 UTC m=+144.999753207" Mar 18 16:47:08.164053 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:47:08.164007 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-7776559f4-s949p" podUID="52134bb8-4757-4487-a5b9-38bdaea56506" Mar 18 16:47:08.177134 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:47:08.177106 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-7zqf5" podUID="1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3" Mar 18 16:47:08.201442 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:47:08.201412 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-bxl54" podUID="5f10649a-4fba-40f6-9e10-02d8301f5e9e" Mar 18 16:47:08.293545 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:08.293515 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7zqf5" Mar 18 16:47:09.900093 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:47:09.900029 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-wtbdl" podUID="52f2a3f3-56d7-41f7-8bed-9e7229d96408" Mar 18 16:47:13.150408 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:13.150366 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3-cert\") pod \"ingress-canary-7zqf5\" (UID: \"1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3\") " pod="openshift-ingress-canary/ingress-canary-7zqf5" Mar 18 16:47:13.150919 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:13.150425 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f10649a-4fba-40f6-9e10-02d8301f5e9e-metrics-tls\") pod \"dns-default-bxl54\" (UID: \"5f10649a-4fba-40f6-9e10-02d8301f5e9e\") " pod="openshift-dns/dns-default-bxl54" Mar 18 16:47:13.150919 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:13.150464 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/52134bb8-4757-4487-a5b9-38bdaea56506-registry-tls\") pod \"image-registry-7776559f4-s949p\" (UID: \"52134bb8-4757-4487-a5b9-38bdaea56506\") " pod="openshift-image-registry/image-registry-7776559f4-s949p" Mar 18 16:47:13.152929 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:13.152898 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f10649a-4fba-40f6-9e10-02d8301f5e9e-metrics-tls\") pod \"dns-default-bxl54\" (UID: \"5f10649a-4fba-40f6-9e10-02d8301f5e9e\") " pod="openshift-dns/dns-default-bxl54" Mar 18 
16:47:13.153065 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:13.153051 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/52134bb8-4757-4487-a5b9-38bdaea56506-registry-tls\") pod \"image-registry-7776559f4-s949p\" (UID: \"52134bb8-4757-4487-a5b9-38bdaea56506\") " pod="openshift-image-registry/image-registry-7776559f4-s949p" Mar 18 16:47:13.153102 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:13.153051 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3-cert\") pod \"ingress-canary-7zqf5\" (UID: \"1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3\") " pod="openshift-ingress-canary/ingress-canary-7zqf5" Mar 18 16:47:13.396771 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:13.396737 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-dz9s5\"" Mar 18 16:47:13.404967 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:13.404873 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7zqf5" Mar 18 16:47:13.527860 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:13.527826 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7zqf5"] Mar 18 16:47:13.530770 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:47:13.530734 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a4fd5d2_e5ee_4db9_9545_6baa4c7e38b3.slice/crio-65f4b11086f88dc2253c5ef5bc2dfa12dc245549e9d1ab4f06ee95d487df9c1f WatchSource:0}: Error finding container 65f4b11086f88dc2253c5ef5bc2dfa12dc245549e9d1ab4f06ee95d487df9c1f: Status 404 returned error can't find the container with id 65f4b11086f88dc2253c5ef5bc2dfa12dc245549e9d1ab4f06ee95d487df9c1f Mar 18 16:47:14.312737 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:14.312694 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7zqf5" event={"ID":"1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3","Type":"ContainerStarted","Data":"65f4b11086f88dc2253c5ef5bc2dfa12dc245549e9d1ab4f06ee95d487df9c1f"} Mar 18 16:47:15.316577 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:15.316493 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7zqf5" event={"ID":"1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3","Type":"ContainerStarted","Data":"603104f05f6e8d00809b20ef9222a384d2eca55dac8ad8fb323a21b73b3b24e1"} Mar 18 16:47:15.333418 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:15.333340 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-7zqf5" podStartSLOduration=128.824140783 podStartE2EDuration="2m10.333324631s" podCreationTimestamp="2026-03-18 16:45:05 +0000 UTC" firstStartedPulling="2026-03-18 16:47:13.532689034 +0000 UTC m=+161.248112010" lastFinishedPulling="2026-03-18 16:47:15.041872877 +0000 UTC m=+162.757295858" 
observedRunningTime="2026-03-18 16:47:15.332685192 +0000 UTC m=+163.048108210" watchObservedRunningTime="2026-03-18 16:47:15.333324631 +0000 UTC m=+163.048747819" Mar 18 16:47:18.883616 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:18.883574 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-bxl54" Mar 18 16:47:18.886687 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:18.886659 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-wnp77\"" Mar 18 16:47:18.894699 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:18.894677 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-bxl54" Mar 18 16:47:19.027168 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:19.027142 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bxl54"] Mar 18 16:47:19.029555 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:47:19.029524 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f10649a_4fba_40f6_9e10_02d8301f5e9e.slice/crio-2013cd39e4bc660952c39b5da356a2a7ec595014d7c0495e781aba9054006a35 WatchSource:0}: Error finding container 2013cd39e4bc660952c39b5da356a2a7ec595014d7c0495e781aba9054006a35: Status 404 returned error can't find the container with id 2013cd39e4bc660952c39b5da356a2a7ec595014d7c0495e781aba9054006a35 Mar 18 16:47:19.230549 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:19.230509 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-fq5mw"] Mar 18 16:47:19.235023 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:19.234996 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fq5mw" Mar 18 16:47:19.248931 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:19.248907 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Mar 18 16:47:19.260739 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:19.260706 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Mar 18 16:47:19.263315 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:19.263296 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Mar 18 16:47:19.263578 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:19.263560 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-fhwjh\"" Mar 18 16:47:19.264371 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:19.264351 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Mar 18 16:47:19.284163 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:19.284128 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fq5mw"] Mar 18 16:47:19.300789 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:19.300753 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/278983c0-c182-4da5-a2f5-9f699fb47ede-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fq5mw\" (UID: \"278983c0-c182-4da5-a2f5-9f699fb47ede\") " pod="openshift-insights/insights-runtime-extractor-fq5mw" Mar 18 16:47:19.300990 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:19.300796 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/278983c0-c182-4da5-a2f5-9f699fb47ede-data-volume\") pod \"insights-runtime-extractor-fq5mw\" (UID: \"278983c0-c182-4da5-a2f5-9f699fb47ede\") " pod="openshift-insights/insights-runtime-extractor-fq5mw" Mar 18 16:47:19.300990 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:19.300858 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/278983c0-c182-4da5-a2f5-9f699fb47ede-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fq5mw\" (UID: \"278983c0-c182-4da5-a2f5-9f699fb47ede\") " pod="openshift-insights/insights-runtime-extractor-fq5mw" Mar 18 16:47:19.300990 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:19.300886 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/278983c0-c182-4da5-a2f5-9f699fb47ede-crio-socket\") pod \"insights-runtime-extractor-fq5mw\" (UID: \"278983c0-c182-4da5-a2f5-9f699fb47ede\") " pod="openshift-insights/insights-runtime-extractor-fq5mw" Mar 18 16:47:19.300990 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:19.300929 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wjqr\" (UniqueName: \"kubernetes.io/projected/278983c0-c182-4da5-a2f5-9f699fb47ede-kube-api-access-2wjqr\") pod \"insights-runtime-extractor-fq5mw\" (UID: \"278983c0-c182-4da5-a2f5-9f699fb47ede\") " pod="openshift-insights/insights-runtime-extractor-fq5mw" Mar 18 16:47:19.327036 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:19.327004 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bxl54" event={"ID":"5f10649a-4fba-40f6-9e10-02d8301f5e9e","Type":"ContainerStarted","Data":"2013cd39e4bc660952c39b5da356a2a7ec595014d7c0495e781aba9054006a35"} Mar 18 16:47:19.401753 ip-10-0-135-99 kubenswrapper[2573]: 
I0318 16:47:19.401719 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/278983c0-c182-4da5-a2f5-9f699fb47ede-data-volume\") pod \"insights-runtime-extractor-fq5mw\" (UID: \"278983c0-c182-4da5-a2f5-9f699fb47ede\") " pod="openshift-insights/insights-runtime-extractor-fq5mw" Mar 18 16:47:19.401922 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:19.401765 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/278983c0-c182-4da5-a2f5-9f699fb47ede-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fq5mw\" (UID: \"278983c0-c182-4da5-a2f5-9f699fb47ede\") " pod="openshift-insights/insights-runtime-extractor-fq5mw" Mar 18 16:47:19.401922 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:19.401815 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/278983c0-c182-4da5-a2f5-9f699fb47ede-crio-socket\") pod \"insights-runtime-extractor-fq5mw\" (UID: \"278983c0-c182-4da5-a2f5-9f699fb47ede\") " pod="openshift-insights/insights-runtime-extractor-fq5mw" Mar 18 16:47:19.401922 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:19.401868 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2wjqr\" (UniqueName: \"kubernetes.io/projected/278983c0-c182-4da5-a2f5-9f699fb47ede-kube-api-access-2wjqr\") pod \"insights-runtime-extractor-fq5mw\" (UID: \"278983c0-c182-4da5-a2f5-9f699fb47ede\") " pod="openshift-insights/insights-runtime-extractor-fq5mw" Mar 18 16:47:19.401922 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:19.401901 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/278983c0-c182-4da5-a2f5-9f699fb47ede-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fq5mw\" (UID: 
\"278983c0-c182-4da5-a2f5-9f699fb47ede\") " pod="openshift-insights/insights-runtime-extractor-fq5mw" Mar 18 16:47:19.402102 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:19.401969 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/278983c0-c182-4da5-a2f5-9f699fb47ede-crio-socket\") pod \"insights-runtime-extractor-fq5mw\" (UID: \"278983c0-c182-4da5-a2f5-9f699fb47ede\") " pod="openshift-insights/insights-runtime-extractor-fq5mw" Mar 18 16:47:19.402161 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:19.402141 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/278983c0-c182-4da5-a2f5-9f699fb47ede-data-volume\") pod \"insights-runtime-extractor-fq5mw\" (UID: \"278983c0-c182-4da5-a2f5-9f699fb47ede\") " pod="openshift-insights/insights-runtime-extractor-fq5mw" Mar 18 16:47:19.403037 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:19.403017 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/278983c0-c182-4da5-a2f5-9f699fb47ede-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fq5mw\" (UID: \"278983c0-c182-4da5-a2f5-9f699fb47ede\") " pod="openshift-insights/insights-runtime-extractor-fq5mw" Mar 18 16:47:19.404231 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:19.404213 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/278983c0-c182-4da5-a2f5-9f699fb47ede-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fq5mw\" (UID: \"278983c0-c182-4da5-a2f5-9f699fb47ede\") " pod="openshift-insights/insights-runtime-extractor-fq5mw" Mar 18 16:47:19.439695 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:19.439666 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wjqr\" (UniqueName: 
\"kubernetes.io/projected/278983c0-c182-4da5-a2f5-9f699fb47ede-kube-api-access-2wjqr\") pod \"insights-runtime-extractor-fq5mw\" (UID: \"278983c0-c182-4da5-a2f5-9f699fb47ede\") " pod="openshift-insights/insights-runtime-extractor-fq5mw" Mar 18 16:47:19.544599 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:19.544485 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fq5mw" Mar 18 16:47:19.716312 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:47:19.716274 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod278983c0_c182_4da5_a2f5_9f699fb47ede.slice/crio-5a71dc2aebd39557eeb7041ef9afbef75bae4cdc758923f0524fc62aa680c1ce WatchSource:0}: Error finding container 5a71dc2aebd39557eeb7041ef9afbef75bae4cdc758923f0524fc62aa680c1ce: Status 404 returned error can't find the container with id 5a71dc2aebd39557eeb7041ef9afbef75bae4cdc758923f0524fc62aa680c1ce Mar 18 16:47:19.717766 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:19.717719 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fq5mw"] Mar 18 16:47:20.332106 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:20.332062 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fq5mw" event={"ID":"278983c0-c182-4da5-a2f5-9f699fb47ede","Type":"ContainerStarted","Data":"9a257d41ad7dbc93762d97021690d692949955f34ebbdb4251f17e0592327f3e"} Mar 18 16:47:20.332579 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:20.332117 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fq5mw" event={"ID":"278983c0-c182-4da5-a2f5-9f699fb47ede","Type":"ContainerStarted","Data":"5a71dc2aebd39557eeb7041ef9afbef75bae4cdc758923f0524fc62aa680c1ce"} Mar 18 16:47:21.336851 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:21.336810 2573 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fq5mw" event={"ID":"278983c0-c182-4da5-a2f5-9f699fb47ede","Type":"ContainerStarted","Data":"f17e4d611ae9f4a1c75a1c1f0bd61d30386397b9e6acb800d1596fea50bd93a5"} Mar 18 16:47:21.338603 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:21.338573 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bxl54" event={"ID":"5f10649a-4fba-40f6-9e10-02d8301f5e9e","Type":"ContainerStarted","Data":"884dbc527b8108e4f33162a98f208aca7c82d43d73e716a0ac2db6de82ac2a44"} Mar 18 16:47:21.338720 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:21.338610 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bxl54" event={"ID":"5f10649a-4fba-40f6-9e10-02d8301f5e9e","Type":"ContainerStarted","Data":"d2628fbf30f2b713565c06570854c6db28b0a22c8cf956854f810b974f8f0fb1"} Mar 18 16:47:21.338792 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:21.338773 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-bxl54" Mar 18 16:47:21.884137 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:21.884096 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wtbdl" Mar 18 16:47:22.342984 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:22.342932 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fq5mw" event={"ID":"278983c0-c182-4da5-a2f5-9f699fb47ede","Type":"ContainerStarted","Data":"2f01c564b8520e95853a8e236246b48e59add99139b3cda1483751e617e11c0d"} Mar 18 16:47:22.394068 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:22.394012 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-bxl54" podStartSLOduration=135.945465011 podStartE2EDuration="2m17.393994411s" podCreationTimestamp="2026-03-18 16:45:05 +0000 UTC" firstStartedPulling="2026-03-18 16:47:19.031587394 +0000 UTC m=+166.747010370" lastFinishedPulling="2026-03-18 16:47:20.480116781 +0000 UTC m=+168.195539770" observedRunningTime="2026-03-18 16:47:21.377687874 +0000 UTC m=+169.093110874" watchObservedRunningTime="2026-03-18 16:47:22.393994411 +0000 UTC m=+170.109417410" Mar 18 16:47:22.394259 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:22.394163 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-fq5mw" podStartSLOduration=1.026152473 podStartE2EDuration="3.394156835s" podCreationTimestamp="2026-03-18 16:47:19 +0000 UTC" firstStartedPulling="2026-03-18 16:47:19.80257605 +0000 UTC m=+167.517999026" lastFinishedPulling="2026-03-18 16:47:22.170580412 +0000 UTC m=+169.886003388" observedRunningTime="2026-03-18 16:47:22.393724762 +0000 UTC m=+170.109147760" watchObservedRunningTime="2026-03-18 16:47:22.394156835 +0000 UTC m=+170.109579834" Mar 18 16:47:22.886038 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:22.886001 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7776559f4-s949p"
Mar 18 16:47:22.890523 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:22.890501 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-8k48n\""
Mar 18 16:47:22.896592 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:22.896572 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7776559f4-s949p"
Mar 18 16:47:23.044228 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:23.044191 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7776559f4-s949p"]
Mar 18 16:47:23.048619 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:47:23.048587 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52134bb8_4757_4487_a5b9_38bdaea56506.slice/crio-56d72895fdcbe8ed65d06aca8e8e035e4c9fa72d53de90bbc69d3f1a377c5539 WatchSource:0}: Error finding container 56d72895fdcbe8ed65d06aca8e8e035e4c9fa72d53de90bbc69d3f1a377c5539: Status 404 returned error can't find the container with id 56d72895fdcbe8ed65d06aca8e8e035e4c9fa72d53de90bbc69d3f1a377c5539
Mar 18 16:47:23.347414 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:23.347380 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7776559f4-s949p" event={"ID":"52134bb8-4757-4487-a5b9-38bdaea56506","Type":"ContainerStarted","Data":"71fcd78b817aeea780fa8cfdc83fd793b875bfc172f0e5a1f0e16eb16d34722a"}
Mar 18 16:47:23.347414 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:23.347418 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7776559f4-s949p" event={"ID":"52134bb8-4757-4487-a5b9-38bdaea56506","Type":"ContainerStarted","Data":"56d72895fdcbe8ed65d06aca8e8e035e4c9fa72d53de90bbc69d3f1a377c5539"}
Mar 18 16:47:23.374998 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:23.374933 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7776559f4-s949p" podStartSLOduration=170.37491683 podStartE2EDuration="2m50.37491683s" podCreationTimestamp="2026-03-18 16:44:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:47:23.373117491 +0000 UTC m=+171.088540489" watchObservedRunningTime="2026-03-18 16:47:23.37491683 +0000 UTC m=+171.090339827"
Mar 18 16:47:24.029029 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:24.027465 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-gnhf9"]
Mar 18 16:47:24.033150 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:24.033115 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-gnhf9"
Mar 18 16:47:24.038924 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:24.038897 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-k6b2v\""
Mar 18 16:47:24.039062 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:24.038934 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Mar 18 16:47:24.044091 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:24.044065 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-gnhf9"]
Mar 18 16:47:24.139751 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:24.139706 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/ba0971bb-c918-470a-aa1b-4c2fbcc0115a-tls-certificates\") pod \"prometheus-operator-admission-webhook-8444df798b-gnhf9\" (UID: \"ba0971bb-c918-470a-aa1b-4c2fbcc0115a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-gnhf9"
Mar 18 16:47:24.240343 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:24.240301 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/ba0971bb-c918-470a-aa1b-4c2fbcc0115a-tls-certificates\") pod \"prometheus-operator-admission-webhook-8444df798b-gnhf9\" (UID: \"ba0971bb-c918-470a-aa1b-4c2fbcc0115a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-gnhf9"
Mar 18 16:47:24.242789 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:24.242755 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/ba0971bb-c918-470a-aa1b-4c2fbcc0115a-tls-certificates\") pod \"prometheus-operator-admission-webhook-8444df798b-gnhf9\" (UID: \"ba0971bb-c918-470a-aa1b-4c2fbcc0115a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-gnhf9"
Mar 18 16:47:24.342163 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:24.342072 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-gnhf9"
Mar 18 16:47:24.350072 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:24.350022 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7776559f4-s949p"
Mar 18 16:47:24.460596 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:24.460566 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-gnhf9"]
Mar 18 16:47:24.465353 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:47:24.465325 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba0971bb_c918_470a_aa1b_4c2fbcc0115a.slice/crio-b5c3114e3393954c08dc26bbf637c62777d7bcdc6fa25ecdd957b796838f34ed WatchSource:0}: Error finding container b5c3114e3393954c08dc26bbf637c62777d7bcdc6fa25ecdd957b796838f34ed: Status 404 returned error can't find the container with id b5c3114e3393954c08dc26bbf637c62777d7bcdc6fa25ecdd957b796838f34ed
Mar 18 16:47:25.354065 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:25.354031 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-gnhf9" event={"ID":"ba0971bb-c918-470a-aa1b-4c2fbcc0115a","Type":"ContainerStarted","Data":"b5c3114e3393954c08dc26bbf637c62777d7bcdc6fa25ecdd957b796838f34ed"}
Mar 18 16:47:26.357987 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:26.357928 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-gnhf9" event={"ID":"ba0971bb-c918-470a-aa1b-4c2fbcc0115a","Type":"ContainerStarted","Data":"92ddbf16b4d16d5e10bac8dd5ccffabb04652de579297577454925bd90135f13"}
Mar 18 16:47:26.358391 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:26.358151 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-gnhf9"
Mar 18 16:47:26.362886 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:26.362852 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-gnhf9"
Mar 18 16:47:26.376068 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:26.376017 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-gnhf9" podStartSLOduration=2.25324405 podStartE2EDuration="3.376004375s" podCreationTimestamp="2026-03-18 16:47:23 +0000 UTC" firstStartedPulling="2026-03-18 16:47:24.46722838 +0000 UTC m=+172.182651356" lastFinishedPulling="2026-03-18 16:47:25.589988684 +0000 UTC m=+173.305411681" observedRunningTime="2026-03-18 16:47:26.374966341 +0000 UTC m=+174.090389336" watchObservedRunningTime="2026-03-18 16:47:26.376004375 +0000 UTC m=+174.091427373"
Mar 18 16:47:27.044004 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:27.043968 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-6b948c769-n79l9"]
Mar 18 16:47:27.046613 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:27.046594 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-6b948c769-n79l9"
Mar 18 16:47:27.049249 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:27.049227 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Mar 18 16:47:27.050395 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:27.050365 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Mar 18 16:47:27.050513 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:27.050394 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Mar 18 16:47:27.050513 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:27.050458 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-h9grd\""
Mar 18 16:47:27.050513 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:27.050506 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Mar 18 16:47:27.050634 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:27.050579 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Mar 18 16:47:27.057791 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:27.057764 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-6b948c769-n79l9"]
Mar 18 16:47:27.162210 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:27.162172 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/d3adf32b-df34-4e20-8641-d4af2dea277a-prometheus-operator-tls\") pod \"prometheus-operator-6b948c769-n79l9\" (UID: \"d3adf32b-df34-4e20-8641-d4af2dea277a\") " pod="openshift-monitoring/prometheus-operator-6b948c769-n79l9"
Mar 18 16:47:27.162210 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:27.162218 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d3adf32b-df34-4e20-8641-d4af2dea277a-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6b948c769-n79l9\" (UID: \"d3adf32b-df34-4e20-8641-d4af2dea277a\") " pod="openshift-monitoring/prometheus-operator-6b948c769-n79l9"
Mar 18 16:47:27.162470 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:27.162329 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d3adf32b-df34-4e20-8641-d4af2dea277a-metrics-client-ca\") pod \"prometheus-operator-6b948c769-n79l9\" (UID: \"d3adf32b-df34-4e20-8641-d4af2dea277a\") " pod="openshift-monitoring/prometheus-operator-6b948c769-n79l9"
Mar 18 16:47:27.162470 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:27.162387 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mbtp\" (UniqueName: \"kubernetes.io/projected/d3adf32b-df34-4e20-8641-d4af2dea277a-kube-api-access-4mbtp\") pod \"prometheus-operator-6b948c769-n79l9\" (UID: \"d3adf32b-df34-4e20-8641-d4af2dea277a\") " pod="openshift-monitoring/prometheus-operator-6b948c769-n79l9"
Mar 18 16:47:27.263725 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:27.263686 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d3adf32b-df34-4e20-8641-d4af2dea277a-metrics-client-ca\") pod \"prometheus-operator-6b948c769-n79l9\" (UID: \"d3adf32b-df34-4e20-8641-d4af2dea277a\") " pod="openshift-monitoring/prometheus-operator-6b948c769-n79l9"
Mar 18 16:47:27.263886 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:27.263745 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4mbtp\" (UniqueName: \"kubernetes.io/projected/d3adf32b-df34-4e20-8641-d4af2dea277a-kube-api-access-4mbtp\") pod \"prometheus-operator-6b948c769-n79l9\" (UID: \"d3adf32b-df34-4e20-8641-d4af2dea277a\") " pod="openshift-monitoring/prometheus-operator-6b948c769-n79l9"
Mar 18 16:47:27.263886 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:27.263791 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/d3adf32b-df34-4e20-8641-d4af2dea277a-prometheus-operator-tls\") pod \"prometheus-operator-6b948c769-n79l9\" (UID: \"d3adf32b-df34-4e20-8641-d4af2dea277a\") " pod="openshift-monitoring/prometheus-operator-6b948c769-n79l9"
Mar 18 16:47:27.263886 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:27.263824 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d3adf32b-df34-4e20-8641-d4af2dea277a-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6b948c769-n79l9\" (UID: \"d3adf32b-df34-4e20-8641-d4af2dea277a\") " pod="openshift-monitoring/prometheus-operator-6b948c769-n79l9"
Mar 18 16:47:27.264072 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:47:27.263969 2573 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Mar 18 16:47:27.264072 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:47:27.264047 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3adf32b-df34-4e20-8641-d4af2dea277a-prometheus-operator-tls podName:d3adf32b-df34-4e20-8641-d4af2dea277a nodeName:}" failed. No retries permitted until 2026-03-18 16:47:27.764025094 +0000 UTC m=+175.479448081 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/d3adf32b-df34-4e20-8641-d4af2dea277a-prometheus-operator-tls") pod "prometheus-operator-6b948c769-n79l9" (UID: "d3adf32b-df34-4e20-8641-d4af2dea277a") : secret "prometheus-operator-tls" not found
Mar 18 16:47:27.264446 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:27.264429 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d3adf32b-df34-4e20-8641-d4af2dea277a-metrics-client-ca\") pod \"prometheus-operator-6b948c769-n79l9\" (UID: \"d3adf32b-df34-4e20-8641-d4af2dea277a\") " pod="openshift-monitoring/prometheus-operator-6b948c769-n79l9"
Mar 18 16:47:27.266464 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:27.266439 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d3adf32b-df34-4e20-8641-d4af2dea277a-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6b948c769-n79l9\" (UID: \"d3adf32b-df34-4e20-8641-d4af2dea277a\") " pod="openshift-monitoring/prometheus-operator-6b948c769-n79l9"
Mar 18 16:47:27.272885 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:27.272865 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mbtp\" (UniqueName: \"kubernetes.io/projected/d3adf32b-df34-4e20-8641-d4af2dea277a-kube-api-access-4mbtp\") pod \"prometheus-operator-6b948c769-n79l9\" (UID: \"d3adf32b-df34-4e20-8641-d4af2dea277a\") " pod="openshift-monitoring/prometheus-operator-6b948c769-n79l9"
Mar 18 16:47:27.767518 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:27.767473 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/d3adf32b-df34-4e20-8641-d4af2dea277a-prometheus-operator-tls\") pod \"prometheus-operator-6b948c769-n79l9\" (UID: \"d3adf32b-df34-4e20-8641-d4af2dea277a\") " pod="openshift-monitoring/prometheus-operator-6b948c769-n79l9"
Mar 18 16:47:27.769934 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:27.769906 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/d3adf32b-df34-4e20-8641-d4af2dea277a-prometheus-operator-tls\") pod \"prometheus-operator-6b948c769-n79l9\" (UID: \"d3adf32b-df34-4e20-8641-d4af2dea277a\") " pod="openshift-monitoring/prometheus-operator-6b948c769-n79l9"
Mar 18 16:47:27.956389 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:27.956346 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-6b948c769-n79l9"
Mar 18 16:47:28.079014 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:28.078977 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-6b948c769-n79l9"]
Mar 18 16:47:28.085555 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:47:28.085522 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3adf32b_df34_4e20_8641_d4af2dea277a.slice/crio-93d21b63a2e7e9e0e6ea81d59418b168cb340cf9a2b26eef7a38a3833fc3a4bf WatchSource:0}: Error finding container 93d21b63a2e7e9e0e6ea81d59418b168cb340cf9a2b26eef7a38a3833fc3a4bf: Status 404 returned error can't find the container with id 93d21b63a2e7e9e0e6ea81d59418b168cb340cf9a2b26eef7a38a3833fc3a4bf
Mar 18 16:47:28.363858 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:28.363764 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6b948c769-n79l9" event={"ID":"d3adf32b-df34-4e20-8641-d4af2dea277a","Type":"ContainerStarted","Data":"93d21b63a2e7e9e0e6ea81d59418b168cb340cf9a2b26eef7a38a3833fc3a4bf"}
Mar 18 16:47:30.370258 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:30.370215 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6b948c769-n79l9" event={"ID":"d3adf32b-df34-4e20-8641-d4af2dea277a","Type":"ContainerStarted","Data":"6bf84ad39d2ea44eee0b32e3822d10a31ae085a9561227c3ce1289a530a7c23c"}
Mar 18 16:47:30.370656 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:30.370265 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6b948c769-n79l9" event={"ID":"d3adf32b-df34-4e20-8641-d4af2dea277a","Type":"ContainerStarted","Data":"23d7b8978bd7fffa0db1ab974d55c44aee16dbbedd5767d577df13c075642bbe"}
Mar 18 16:47:30.394291 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:30.394240 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-6b948c769-n79l9" podStartSLOduration=2.09122033 podStartE2EDuration="3.394226348s" podCreationTimestamp="2026-03-18 16:47:27 +0000 UTC" firstStartedPulling="2026-03-18 16:47:28.087439233 +0000 UTC m=+175.802862209" lastFinishedPulling="2026-03-18 16:47:29.390445233 +0000 UTC m=+177.105868227" observedRunningTime="2026-03-18 16:47:30.393440413 +0000 UTC m=+178.108863411" watchObservedRunningTime="2026-03-18 16:47:30.394226348 +0000 UTC m=+178.109649380"
Mar 18 16:47:31.345635 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:31.345604 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-bxl54"
Mar 18 16:47:36.249990 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:36.249925 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-zkc96"]
Mar 18 16:47:36.253032 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:36.253005 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-zkc96"
Mar 18 16:47:36.255590 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:36.255565 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Mar 18 16:47:36.255786 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:36.255604 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-g2m56\""
Mar 18 16:47:36.256006 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:36.255990 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Mar 18 16:47:36.256101 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:36.256012 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Mar 18 16:47:36.339518 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:36.339481 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6020e2e0-83e5-49b8-a158-e98d1e6697a8-sys\") pod \"node-exporter-zkc96\" (UID: \"6020e2e0-83e5-49b8-a158-e98d1e6697a8\") " pod="openshift-monitoring/node-exporter-zkc96"
Mar 18 16:47:36.339709 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:36.339524 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6020e2e0-83e5-49b8-a158-e98d1e6697a8-node-exporter-accelerators-collector-config\") pod \"node-exporter-zkc96\" (UID: \"6020e2e0-83e5-49b8-a158-e98d1e6697a8\") " pod="openshift-monitoring/node-exporter-zkc96"
Mar 18 16:47:36.339709 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:36.339551 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6020e2e0-83e5-49b8-a158-e98d1e6697a8-root\") pod \"node-exporter-zkc96\" (UID: \"6020e2e0-83e5-49b8-a158-e98d1e6697a8\") " pod="openshift-monitoring/node-exporter-zkc96"
Mar 18 16:47:36.339709 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:36.339601 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6020e2e0-83e5-49b8-a158-e98d1e6697a8-node-exporter-textfile\") pod \"node-exporter-zkc96\" (UID: \"6020e2e0-83e5-49b8-a158-e98d1e6697a8\") " pod="openshift-monitoring/node-exporter-zkc96"
Mar 18 16:47:36.339709 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:36.339685 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6020e2e0-83e5-49b8-a158-e98d1e6697a8-node-exporter-tls\") pod \"node-exporter-zkc96\" (UID: \"6020e2e0-83e5-49b8-a158-e98d1e6697a8\") " pod="openshift-monitoring/node-exporter-zkc96"
Mar 18 16:47:36.339875 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:36.339723 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6020e2e0-83e5-49b8-a158-e98d1e6697a8-metrics-client-ca\") pod \"node-exporter-zkc96\" (UID: \"6020e2e0-83e5-49b8-a158-e98d1e6697a8\") " pod="openshift-monitoring/node-exporter-zkc96"
Mar 18 16:47:36.339875 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:36.339753 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6020e2e0-83e5-49b8-a158-e98d1e6697a8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zkc96\" (UID: \"6020e2e0-83e5-49b8-a158-e98d1e6697a8\") " pod="openshift-monitoring/node-exporter-zkc96"
Mar 18 16:47:36.339875 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:36.339787 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6020e2e0-83e5-49b8-a158-e98d1e6697a8-node-exporter-wtmp\") pod \"node-exporter-zkc96\" (UID: \"6020e2e0-83e5-49b8-a158-e98d1e6697a8\") " pod="openshift-monitoring/node-exporter-zkc96"
Mar 18 16:47:36.339875 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:36.339842 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7ttb\" (UniqueName: \"kubernetes.io/projected/6020e2e0-83e5-49b8-a158-e98d1e6697a8-kube-api-access-m7ttb\") pod \"node-exporter-zkc96\" (UID: \"6020e2e0-83e5-49b8-a158-e98d1e6697a8\") " pod="openshift-monitoring/node-exporter-zkc96"
Mar 18 16:47:36.440484 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:36.440446 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6020e2e0-83e5-49b8-a158-e98d1e6697a8-node-exporter-wtmp\") pod \"node-exporter-zkc96\" (UID: \"6020e2e0-83e5-49b8-a158-e98d1e6697a8\") " pod="openshift-monitoring/node-exporter-zkc96"
Mar 18 16:47:36.440671 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:36.440507 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m7ttb\" (UniqueName: \"kubernetes.io/projected/6020e2e0-83e5-49b8-a158-e98d1e6697a8-kube-api-access-m7ttb\") pod \"node-exporter-zkc96\" (UID: \"6020e2e0-83e5-49b8-a158-e98d1e6697a8\") " pod="openshift-monitoring/node-exporter-zkc96"
Mar 18 16:47:36.440671 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:36.440569 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6020e2e0-83e5-49b8-a158-e98d1e6697a8-sys\") pod \"node-exporter-zkc96\" (UID: \"6020e2e0-83e5-49b8-a158-e98d1e6697a8\") " pod="openshift-monitoring/node-exporter-zkc96"
Mar 18 16:47:36.440671 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:36.440600 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6020e2e0-83e5-49b8-a158-e98d1e6697a8-node-exporter-accelerators-collector-config\") pod \"node-exporter-zkc96\" (UID: \"6020e2e0-83e5-49b8-a158-e98d1e6697a8\") " pod="openshift-monitoring/node-exporter-zkc96"
Mar 18 16:47:36.440671 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:36.440628 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6020e2e0-83e5-49b8-a158-e98d1e6697a8-root\") pod \"node-exporter-zkc96\" (UID: \"6020e2e0-83e5-49b8-a158-e98d1e6697a8\") " pod="openshift-monitoring/node-exporter-zkc96"
Mar 18 16:47:36.440671 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:36.440652 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6020e2e0-83e5-49b8-a158-e98d1e6697a8-node-exporter-wtmp\") pod \"node-exporter-zkc96\" (UID: \"6020e2e0-83e5-49b8-a158-e98d1e6697a8\") " pod="openshift-monitoring/node-exporter-zkc96"
Mar 18 16:47:36.440933 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:36.440766 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6020e2e0-83e5-49b8-a158-e98d1e6697a8-root\") pod \"node-exporter-zkc96\" (UID: \"6020e2e0-83e5-49b8-a158-e98d1e6697a8\") " pod="openshift-monitoring/node-exporter-zkc96"
Mar 18 16:47:36.440933 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:36.440825 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6020e2e0-83e5-49b8-a158-e98d1e6697a8-sys\") pod \"node-exporter-zkc96\" (UID: \"6020e2e0-83e5-49b8-a158-e98d1e6697a8\") " pod="openshift-monitoring/node-exporter-zkc96"
Mar 18 16:47:36.440933 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:36.440662 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6020e2e0-83e5-49b8-a158-e98d1e6697a8-node-exporter-textfile\") pod \"node-exporter-zkc96\" (UID: \"6020e2e0-83e5-49b8-a158-e98d1e6697a8\") " pod="openshift-monitoring/node-exporter-zkc96"
Mar 18 16:47:36.440933 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:36.440907 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6020e2e0-83e5-49b8-a158-e98d1e6697a8-node-exporter-tls\") pod \"node-exporter-zkc96\" (UID: \"6020e2e0-83e5-49b8-a158-e98d1e6697a8\") " pod="openshift-monitoring/node-exporter-zkc96"
Mar 18 16:47:36.441441 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:36.441419 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6020e2e0-83e5-49b8-a158-e98d1e6697a8-metrics-client-ca\") pod \"node-exporter-zkc96\" (UID: \"6020e2e0-83e5-49b8-a158-e98d1e6697a8\") " pod="openshift-monitoring/node-exporter-zkc96"
Mar 18 16:47:36.441534 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:36.441455 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6020e2e0-83e5-49b8-a158-e98d1e6697a8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zkc96\" (UID: \"6020e2e0-83e5-49b8-a158-e98d1e6697a8\") " pod="openshift-monitoring/node-exporter-zkc96"
Mar 18 16:47:36.441534 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:36.441030 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6020e2e0-83e5-49b8-a158-e98d1e6697a8-node-exporter-textfile\") pod \"node-exporter-zkc96\" (UID: \"6020e2e0-83e5-49b8-a158-e98d1e6697a8\") " pod="openshift-monitoring/node-exporter-zkc96"
Mar 18 16:47:36.441534 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:36.441302 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6020e2e0-83e5-49b8-a158-e98d1e6697a8-node-exporter-accelerators-collector-config\") pod \"node-exporter-zkc96\" (UID: \"6020e2e0-83e5-49b8-a158-e98d1e6697a8\") " pod="openshift-monitoring/node-exporter-zkc96"
Mar 18 16:47:36.441534 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:47:36.441089 2573 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Mar 18 16:47:36.441734 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:47:36.441582 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6020e2e0-83e5-49b8-a158-e98d1e6697a8-node-exporter-tls podName:6020e2e0-83e5-49b8-a158-e98d1e6697a8 nodeName:}" failed. No retries permitted until 2026-03-18 16:47:36.941562524 +0000 UTC m=+184.656985511 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/6020e2e0-83e5-49b8-a158-e98d1e6697a8-node-exporter-tls") pod "node-exporter-zkc96" (UID: "6020e2e0-83e5-49b8-a158-e98d1e6697a8") : secret "node-exporter-tls" not found
Mar 18 16:47:36.442034 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:36.442011 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6020e2e0-83e5-49b8-a158-e98d1e6697a8-metrics-client-ca\") pod \"node-exporter-zkc96\" (UID: \"6020e2e0-83e5-49b8-a158-e98d1e6697a8\") " pod="openshift-monitoring/node-exporter-zkc96"
Mar 18 16:47:36.444364 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:36.444334 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6020e2e0-83e5-49b8-a158-e98d1e6697a8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zkc96\" (UID: \"6020e2e0-83e5-49b8-a158-e98d1e6697a8\") " pod="openshift-monitoring/node-exporter-zkc96"
Mar 18 16:47:36.449494 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:36.449462 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7ttb\" (UniqueName: \"kubernetes.io/projected/6020e2e0-83e5-49b8-a158-e98d1e6697a8-kube-api-access-m7ttb\") pod \"node-exporter-zkc96\" (UID: \"6020e2e0-83e5-49b8-a158-e98d1e6697a8\") " pod="openshift-monitoring/node-exporter-zkc96"
Mar 18 16:47:36.945198 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:36.945139 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6020e2e0-83e5-49b8-a158-e98d1e6697a8-node-exporter-tls\") pod \"node-exporter-zkc96\" (UID: \"6020e2e0-83e5-49b8-a158-e98d1e6697a8\") " pod="openshift-monitoring/node-exporter-zkc96"
Mar 18 16:47:36.947596 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:36.947569 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6020e2e0-83e5-49b8-a158-e98d1e6697a8-node-exporter-tls\") pod \"node-exporter-zkc96\" (UID: \"6020e2e0-83e5-49b8-a158-e98d1e6697a8\") " pod="openshift-monitoring/node-exporter-zkc96"
Mar 18 16:47:37.163372 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:37.163334 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-zkc96"
Mar 18 16:47:37.173600 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:47:37.173568 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6020e2e0_83e5_49b8_a158_e98d1e6697a8.slice/crio-b0bb9c94b5f2bdf7584a4a40dafd758a6b0de5d6ff9a0337b79db558accf632a WatchSource:0}: Error finding container b0bb9c94b5f2bdf7584a4a40dafd758a6b0de5d6ff9a0337b79db558accf632a: Status 404 returned error can't find the container with id b0bb9c94b5f2bdf7584a4a40dafd758a6b0de5d6ff9a0337b79db558accf632a
Mar 18 16:47:37.389211 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:37.389121 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zkc96" event={"ID":"6020e2e0-83e5-49b8-a158-e98d1e6697a8","Type":"ContainerStarted","Data":"b0bb9c94b5f2bdf7584a4a40dafd758a6b0de5d6ff9a0337b79db558accf632a"}
Mar 18 16:47:38.182074 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:38.182036 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-67fdbb775-kzlfs"]
Mar 18 16:47:38.186819 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:38.186792 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-67fdbb775-kzlfs"
Mar 18 16:47:38.189446 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:38.189422 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Mar 18 16:47:38.189446 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:38.189423 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Mar 18 16:47:38.189684 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:38.189490 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-c75vg\""
Mar 18 16:47:38.189684 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:38.189505 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Mar 18 16:47:38.189684 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:38.189505 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Mar 18 16:47:38.189684 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:38.189490 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Mar 18 16:47:38.190285 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:38.190272 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-8rm9j0hs93pio\""
Mar 18 16:47:38.194024 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:38.194000 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-67fdbb775-kzlfs"]
Mar 18 16:47:38.356732 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:38.356643 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2386015b-9e91-4f64-b14e-90352af8aad6-secret-grpc-tls\") pod \"thanos-querier-67fdbb775-kzlfs\" (UID: \"2386015b-9e91-4f64-b14e-90352af8aad6\") " pod="openshift-monitoring/thanos-querier-67fdbb775-kzlfs"
Mar 18 16:47:38.356732 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:38.356682 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64sgr\" (UniqueName: \"kubernetes.io/projected/2386015b-9e91-4f64-b14e-90352af8aad6-kube-api-access-64sgr\") pod \"thanos-querier-67fdbb775-kzlfs\" (UID: \"2386015b-9e91-4f64-b14e-90352af8aad6\") " pod="openshift-monitoring/thanos-querier-67fdbb775-kzlfs"
Mar 18 16:47:38.356732 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:38.356715 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/2386015b-9e91-4f64-b14e-90352af8aad6-secret-thanos-querier-tls\") pod \"thanos-querier-67fdbb775-kzlfs\" (UID: \"2386015b-9e91-4f64-b14e-90352af8aad6\") " pod="openshift-monitoring/thanos-querier-67fdbb775-kzlfs"
Mar 18 16:47:38.356976 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:38.356736 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2386015b-9e91-4f64-b14e-90352af8aad6-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-67fdbb775-kzlfs\" (UID: \"2386015b-9e91-4f64-b14e-90352af8aad6\") " pod="openshift-monitoring/thanos-querier-67fdbb775-kzlfs"
Mar 18 16:47:38.356976 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:38.356756 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/2386015b-9e91-4f64-b14e-90352af8aad6-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-67fdbb775-kzlfs\" (UID: \"2386015b-9e91-4f64-b14e-90352af8aad6\") " pod="openshift-monitoring/thanos-querier-67fdbb775-kzlfs"
Mar 18 16:47:38.356976 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:38.356775 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2386015b-9e91-4f64-b14e-90352af8aad6-metrics-client-ca\") pod \"thanos-querier-67fdbb775-kzlfs\" (UID: \"2386015b-9e91-4f64-b14e-90352af8aad6\") " pod="openshift-monitoring/thanos-querier-67fdbb775-kzlfs"
Mar 18 16:47:38.356976 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:38.356884 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/2386015b-9e91-4f64-b14e-90352af8aad6-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-67fdbb775-kzlfs\" (UID: \"2386015b-9e91-4f64-b14e-90352af8aad6\") " pod="openshift-monitoring/thanos-querier-67fdbb775-kzlfs"
Mar 18 16:47:38.356976 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:38.356915 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2386015b-9e91-4f64-b14e-90352af8aad6-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-67fdbb775-kzlfs\" (UID: \"2386015b-9e91-4f64-b14e-90352af8aad6\") " pod="openshift-monitoring/thanos-querier-67fdbb775-kzlfs"
Mar 18 16:47:38.393537 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:38.393502 2573 generic.go:358] "Generic (PLEG): container finished" podID="6020e2e0-83e5-49b8-a158-e98d1e6697a8" containerID="70102151b5856732da30136f595be193da65e36ba473723adca0c11b44ba0800" exitCode=0
Mar 18 16:47:38.393900
ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:38.393593 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zkc96" event={"ID":"6020e2e0-83e5-49b8-a158-e98d1e6697a8","Type":"ContainerDied","Data":"70102151b5856732da30136f595be193da65e36ba473723adca0c11b44ba0800"} Mar 18 16:47:38.457504 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:38.457380 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/2386015b-9e91-4f64-b14e-90352af8aad6-secret-thanos-querier-tls\") pod \"thanos-querier-67fdbb775-kzlfs\" (UID: \"2386015b-9e91-4f64-b14e-90352af8aad6\") " pod="openshift-monitoring/thanos-querier-67fdbb775-kzlfs" Mar 18 16:47:38.457504 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:38.457420 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2386015b-9e91-4f64-b14e-90352af8aad6-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-67fdbb775-kzlfs\" (UID: \"2386015b-9e91-4f64-b14e-90352af8aad6\") " pod="openshift-monitoring/thanos-querier-67fdbb775-kzlfs" Mar 18 16:47:38.457504 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:38.457443 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/2386015b-9e91-4f64-b14e-90352af8aad6-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-67fdbb775-kzlfs\" (UID: \"2386015b-9e91-4f64-b14e-90352af8aad6\") " pod="openshift-monitoring/thanos-querier-67fdbb775-kzlfs" Mar 18 16:47:38.457504 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:38.457485 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2386015b-9e91-4f64-b14e-90352af8aad6-metrics-client-ca\") pod 
\"thanos-querier-67fdbb775-kzlfs\" (UID: \"2386015b-9e91-4f64-b14e-90352af8aad6\") " pod="openshift-monitoring/thanos-querier-67fdbb775-kzlfs" Mar 18 16:47:38.457699 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:38.457557 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/2386015b-9e91-4f64-b14e-90352af8aad6-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-67fdbb775-kzlfs\" (UID: \"2386015b-9e91-4f64-b14e-90352af8aad6\") " pod="openshift-monitoring/thanos-querier-67fdbb775-kzlfs" Mar 18 16:47:38.457699 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:38.457590 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2386015b-9e91-4f64-b14e-90352af8aad6-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-67fdbb775-kzlfs\" (UID: \"2386015b-9e91-4f64-b14e-90352af8aad6\") " pod="openshift-monitoring/thanos-querier-67fdbb775-kzlfs" Mar 18 16:47:38.457699 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:38.457662 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2386015b-9e91-4f64-b14e-90352af8aad6-secret-grpc-tls\") pod \"thanos-querier-67fdbb775-kzlfs\" (UID: \"2386015b-9e91-4f64-b14e-90352af8aad6\") " pod="openshift-monitoring/thanos-querier-67fdbb775-kzlfs" Mar 18 16:47:38.457699 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:38.457692 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-64sgr\" (UniqueName: \"kubernetes.io/projected/2386015b-9e91-4f64-b14e-90352af8aad6-kube-api-access-64sgr\") pod \"thanos-querier-67fdbb775-kzlfs\" (UID: \"2386015b-9e91-4f64-b14e-90352af8aad6\") " pod="openshift-monitoring/thanos-querier-67fdbb775-kzlfs" Mar 18 16:47:38.460086 ip-10-0-135-99 
kubenswrapper[2573]: I0318 16:47:38.458805 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2386015b-9e91-4f64-b14e-90352af8aad6-metrics-client-ca\") pod \"thanos-querier-67fdbb775-kzlfs\" (UID: \"2386015b-9e91-4f64-b14e-90352af8aad6\") " pod="openshift-monitoring/thanos-querier-67fdbb775-kzlfs" Mar 18 16:47:38.461151 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:38.460956 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2386015b-9e91-4f64-b14e-90352af8aad6-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-67fdbb775-kzlfs\" (UID: \"2386015b-9e91-4f64-b14e-90352af8aad6\") " pod="openshift-monitoring/thanos-querier-67fdbb775-kzlfs" Mar 18 16:47:38.461151 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:38.461138 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/2386015b-9e91-4f64-b14e-90352af8aad6-secret-thanos-querier-tls\") pod \"thanos-querier-67fdbb775-kzlfs\" (UID: \"2386015b-9e91-4f64-b14e-90352af8aad6\") " pod="openshift-monitoring/thanos-querier-67fdbb775-kzlfs" Mar 18 16:47:38.461339 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:38.461175 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/2386015b-9e91-4f64-b14e-90352af8aad6-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-67fdbb775-kzlfs\" (UID: \"2386015b-9e91-4f64-b14e-90352af8aad6\") " pod="openshift-monitoring/thanos-querier-67fdbb775-kzlfs" Mar 18 16:47:38.461563 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:38.461539 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/2386015b-9e91-4f64-b14e-90352af8aad6-secret-grpc-tls\") pod \"thanos-querier-67fdbb775-kzlfs\" (UID: \"2386015b-9e91-4f64-b14e-90352af8aad6\") " pod="openshift-monitoring/thanos-querier-67fdbb775-kzlfs" Mar 18 16:47:38.461677 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:38.461574 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2386015b-9e91-4f64-b14e-90352af8aad6-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-67fdbb775-kzlfs\" (UID: \"2386015b-9e91-4f64-b14e-90352af8aad6\") " pod="openshift-monitoring/thanos-querier-67fdbb775-kzlfs" Mar 18 16:47:38.461677 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:38.461645 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/2386015b-9e91-4f64-b14e-90352af8aad6-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-67fdbb775-kzlfs\" (UID: \"2386015b-9e91-4f64-b14e-90352af8aad6\") " pod="openshift-monitoring/thanos-querier-67fdbb775-kzlfs" Mar 18 16:47:38.467380 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:38.467349 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-64sgr\" (UniqueName: \"kubernetes.io/projected/2386015b-9e91-4f64-b14e-90352af8aad6-kube-api-access-64sgr\") pod \"thanos-querier-67fdbb775-kzlfs\" (UID: \"2386015b-9e91-4f64-b14e-90352af8aad6\") " pod="openshift-monitoring/thanos-querier-67fdbb775-kzlfs" Mar 18 16:47:38.517174 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:38.517146 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-67fdbb775-kzlfs" Mar 18 16:47:38.646596 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:38.646493 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-67fdbb775-kzlfs"] Mar 18 16:47:38.650691 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:47:38.650655 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2386015b_9e91_4f64_b14e_90352af8aad6.slice/crio-83fd680689f8fd0da437e7f2026201731003437413a9506d02019f26a541a115 WatchSource:0}: Error finding container 83fd680689f8fd0da437e7f2026201731003437413a9506d02019f26a541a115: Status 404 returned error can't find the container with id 83fd680689f8fd0da437e7f2026201731003437413a9506d02019f26a541a115 Mar 18 16:47:39.399761 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:39.399722 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zkc96" event={"ID":"6020e2e0-83e5-49b8-a158-e98d1e6697a8","Type":"ContainerStarted","Data":"1ceae7e31142fc90a0250b96f5d6b72b092c576c222a333a445e34156fa5c054"} Mar 18 16:47:39.400243 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:39.399770 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zkc96" event={"ID":"6020e2e0-83e5-49b8-a158-e98d1e6697a8","Type":"ContainerStarted","Data":"f1afad5e405d85309976cda88494cbecd9f7781799295d7fe8134f424789bed2"} Mar 18 16:47:39.401291 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:39.401264 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67fdbb775-kzlfs" event={"ID":"2386015b-9e91-4f64-b14e-90352af8aad6","Type":"ContainerStarted","Data":"83fd680689f8fd0da437e7f2026201731003437413a9506d02019f26a541a115"} Mar 18 16:47:39.420902 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:39.420850 2573 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-monitoring/node-exporter-zkc96" podStartSLOduration=2.628380075 podStartE2EDuration="3.420833872s" podCreationTimestamp="2026-03-18 16:47:36 +0000 UTC" firstStartedPulling="2026-03-18 16:47:37.175423331 +0000 UTC m=+184.890846308" lastFinishedPulling="2026-03-18 16:47:37.967877114 +0000 UTC m=+185.683300105" observedRunningTime="2026-03-18 16:47:39.419428524 +0000 UTC m=+187.134851534" watchObservedRunningTime="2026-03-18 16:47:39.420833872 +0000 UTC m=+187.136256869" Mar 18 16:47:41.410288 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:41.410258 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67fdbb775-kzlfs" event={"ID":"2386015b-9e91-4f64-b14e-90352af8aad6","Type":"ContainerStarted","Data":"410e17c0a1023eeabf3a4c75b7ac52e06e77f54a8d2d98cac4847778d9627021"} Mar 18 16:47:41.410604 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:41.410298 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67fdbb775-kzlfs" event={"ID":"2386015b-9e91-4f64-b14e-90352af8aad6","Type":"ContainerStarted","Data":"7a1f4671a7184999f42c15bc574e3499885596e57ab3c7f54dd2355fa7bc21b3"} Mar 18 16:47:41.410604 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:41.410311 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67fdbb775-kzlfs" event={"ID":"2386015b-9e91-4f64-b14e-90352af8aad6","Type":"ContainerStarted","Data":"88bb2ad20823f2ed698a06a07f8075f2390a022cfe5d5d82d764e0a48b4f497a"} Mar 18 16:47:41.528881 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:41.528854 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7776559f4-s949p"] Mar 18 16:47:41.532758 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:41.532731 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7776559f4-s949p" Mar 18 16:47:42.416460 ip-10-0-135-99 
kubenswrapper[2573]: I0318 16:47:42.416417 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67fdbb775-kzlfs" event={"ID":"2386015b-9e91-4f64-b14e-90352af8aad6","Type":"ContainerStarted","Data":"78cc78c999c1cfef5735c7956d212d51ca0b5e0322c7ed607b819325788e695c"} Mar 18 16:47:42.416460 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:42.416468 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67fdbb775-kzlfs" event={"ID":"2386015b-9e91-4f64-b14e-90352af8aad6","Type":"ContainerStarted","Data":"23eb30eaf438d44e5d23bdd4642977057d4c6a6699eb824a9ebd60ab5be67322"} Mar 18 16:47:42.417022 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:42.416484 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67fdbb775-kzlfs" event={"ID":"2386015b-9e91-4f64-b14e-90352af8aad6","Type":"ContainerStarted","Data":"1239700494fd52b9c1ab155fc56558ef9cdc9f66c9534ba10dd03855a402eda0"} Mar 18 16:47:42.417272 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:42.417227 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-67fdbb775-kzlfs" Mar 18 16:47:42.450160 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:42.450087 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-67fdbb775-kzlfs" podStartSLOduration=1.755887388 podStartE2EDuration="4.450064329s" podCreationTimestamp="2026-03-18 16:47:38 +0000 UTC" firstStartedPulling="2026-03-18 16:47:38.652533868 +0000 UTC m=+186.367956845" lastFinishedPulling="2026-03-18 16:47:41.346710806 +0000 UTC m=+189.062133786" observedRunningTime="2026-03-18 16:47:42.44732465 +0000 UTC m=+190.162747647" watchObservedRunningTime="2026-03-18 16:47:42.450064329 +0000 UTC m=+190.165487328" Mar 18 16:47:45.772661 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:45.772625 2573 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-console/downloads-5b85974fd6-95vqb"] Mar 18 16:47:45.774949 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:45.774920 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-5b85974fd6-95vqb" Mar 18 16:47:45.777542 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:45.777521 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-6thct\"" Mar 18 16:47:45.777650 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:45.777540 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Mar 18 16:47:45.777650 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:45.777561 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Mar 18 16:47:45.784528 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:45.784506 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-5b85974fd6-95vqb"] Mar 18 16:47:45.918656 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:45.918614 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfqrs\" (UniqueName: \"kubernetes.io/projected/0d492655-5985-4c64-b74b-d3d031ea8e6c-kube-api-access-pfqrs\") pod \"downloads-5b85974fd6-95vqb\" (UID: \"0d492655-5985-4c64-b74b-d3d031ea8e6c\") " pod="openshift-console/downloads-5b85974fd6-95vqb" Mar 18 16:47:46.019142 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:46.019094 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pfqrs\" (UniqueName: \"kubernetes.io/projected/0d492655-5985-4c64-b74b-d3d031ea8e6c-kube-api-access-pfqrs\") pod \"downloads-5b85974fd6-95vqb\" (UID: \"0d492655-5985-4c64-b74b-d3d031ea8e6c\") " pod="openshift-console/downloads-5b85974fd6-95vqb" Mar 18 16:47:46.030434 ip-10-0-135-99 
kubenswrapper[2573]: I0318 16:47:46.030361 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfqrs\" (UniqueName: \"kubernetes.io/projected/0d492655-5985-4c64-b74b-d3d031ea8e6c-kube-api-access-pfqrs\") pod \"downloads-5b85974fd6-95vqb\" (UID: \"0d492655-5985-4c64-b74b-d3d031ea8e6c\") " pod="openshift-console/downloads-5b85974fd6-95vqb" Mar 18 16:47:46.084200 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:46.084159 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-5b85974fd6-95vqb" Mar 18 16:47:46.215836 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:46.215797 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-5b85974fd6-95vqb"] Mar 18 16:47:46.219416 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:47:46.219383 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d492655_5985_4c64_b74b_d3d031ea8e6c.slice/crio-fe56a3d791ad5e56f72f0bbfeee75478e841890de92f3f0f0dc1337005bc40b2 WatchSource:0}: Error finding container fe56a3d791ad5e56f72f0bbfeee75478e841890de92f3f0f0dc1337005bc40b2: Status 404 returned error can't find the container with id fe56a3d791ad5e56f72f0bbfeee75478e841890de92f3f0f0dc1337005bc40b2 Mar 18 16:47:46.432484 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:46.432453 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-5b85974fd6-95vqb" event={"ID":"0d492655-5985-4c64-b74b-d3d031ea8e6c","Type":"ContainerStarted","Data":"fe56a3d791ad5e56f72f0bbfeee75478e841890de92f3f0f0dc1337005bc40b2"} Mar 18 16:47:48.431802 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:47:48.431771 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-67fdbb775-kzlfs" Mar 18 16:48:02.482510 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:02.482468 2573 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-console/downloads-5b85974fd6-95vqb" event={"ID":"0d492655-5985-4c64-b74b-d3d031ea8e6c","Type":"ContainerStarted","Data":"9c67086ae1db27841ff32d362f748982d64d77f5de5c23c6f924924fc384a397"} Mar 18 16:48:02.483056 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:02.482635 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-5b85974fd6-95vqb" Mar 18 16:48:02.488287 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:02.488255 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-5b85974fd6-95vqb" Mar 18 16:48:02.504869 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:02.504802 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-5b85974fd6-95vqb" podStartSLOduration=1.7928038160000002 podStartE2EDuration="17.504784419s" podCreationTimestamp="2026-03-18 16:47:45 +0000 UTC" firstStartedPulling="2026-03-18 16:47:46.221376828 +0000 UTC m=+193.936799804" lastFinishedPulling="2026-03-18 16:48:01.933357427 +0000 UTC m=+209.648780407" observedRunningTime="2026-03-18 16:48:02.503573943 +0000 UTC m=+210.218996938" watchObservedRunningTime="2026-03-18 16:48:02.504784419 +0000 UTC m=+210.220207418" Mar 18 16:48:06.547352 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:06.547286 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-7776559f4-s949p" podUID="52134bb8-4757-4487-a5b9-38bdaea56506" containerName="registry" containerID="cri-o://71fcd78b817aeea780fa8cfdc83fd793b875bfc172f0e5a1f0e16eb16d34722a" gracePeriod=30 Mar 18 16:48:06.786247 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:06.786218 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7776559f4-s949p" Mar 18 16:48:06.892448 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:06.892370 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/52134bb8-4757-4487-a5b9-38bdaea56506-trusted-ca\") pod \"52134bb8-4757-4487-a5b9-38bdaea56506\" (UID: \"52134bb8-4757-4487-a5b9-38bdaea56506\") " Mar 18 16:48:06.892448 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:06.892411 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/52134bb8-4757-4487-a5b9-38bdaea56506-bound-sa-token\") pod \"52134bb8-4757-4487-a5b9-38bdaea56506\" (UID: \"52134bb8-4757-4487-a5b9-38bdaea56506\") " Mar 18 16:48:06.892448 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:06.892445 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhxzr\" (UniqueName: \"kubernetes.io/projected/52134bb8-4757-4487-a5b9-38bdaea56506-kube-api-access-lhxzr\") pod \"52134bb8-4757-4487-a5b9-38bdaea56506\" (UID: \"52134bb8-4757-4487-a5b9-38bdaea56506\") " Mar 18 16:48:06.892760 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:06.892465 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/52134bb8-4757-4487-a5b9-38bdaea56506-image-registry-private-configuration\") pod \"52134bb8-4757-4487-a5b9-38bdaea56506\" (UID: \"52134bb8-4757-4487-a5b9-38bdaea56506\") " Mar 18 16:48:06.892760 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:06.892485 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/52134bb8-4757-4487-a5b9-38bdaea56506-registry-tls\") pod \"52134bb8-4757-4487-a5b9-38bdaea56506\" (UID: \"52134bb8-4757-4487-a5b9-38bdaea56506\") " Mar 18 
16:48:06.892760 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:06.892510 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/52134bb8-4757-4487-a5b9-38bdaea56506-installation-pull-secrets\") pod \"52134bb8-4757-4487-a5b9-38bdaea56506\" (UID: \"52134bb8-4757-4487-a5b9-38bdaea56506\") " Mar 18 16:48:06.892760 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:06.892699 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/52134bb8-4757-4487-a5b9-38bdaea56506-ca-trust-extracted\") pod \"52134bb8-4757-4487-a5b9-38bdaea56506\" (UID: \"52134bb8-4757-4487-a5b9-38bdaea56506\") " Mar 18 16:48:06.892760 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:06.892759 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/52134bb8-4757-4487-a5b9-38bdaea56506-registry-certificates\") pod \"52134bb8-4757-4487-a5b9-38bdaea56506\" (UID: \"52134bb8-4757-4487-a5b9-38bdaea56506\") " Mar 18 16:48:06.893035 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:06.892798 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52134bb8-4757-4487-a5b9-38bdaea56506-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "52134bb8-4757-4487-a5b9-38bdaea56506" (UID: "52134bb8-4757-4487-a5b9-38bdaea56506"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:48:06.893035 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:06.893028 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/52134bb8-4757-4487-a5b9-38bdaea56506-trusted-ca\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\"" Mar 18 16:48:06.893345 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:06.893316 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52134bb8-4757-4487-a5b9-38bdaea56506-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "52134bb8-4757-4487-a5b9-38bdaea56506" (UID: "52134bb8-4757-4487-a5b9-38bdaea56506"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:48:06.895177 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:06.895122 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52134bb8-4757-4487-a5b9-38bdaea56506-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "52134bb8-4757-4487-a5b9-38bdaea56506" (UID: "52134bb8-4757-4487-a5b9-38bdaea56506"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:48:06.895370 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:06.895348 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52134bb8-4757-4487-a5b9-38bdaea56506-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "52134bb8-4757-4487-a5b9-38bdaea56506" (UID: "52134bb8-4757-4487-a5b9-38bdaea56506"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:48:06.895437 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:06.895416 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52134bb8-4757-4487-a5b9-38bdaea56506-kube-api-access-lhxzr" (OuterVolumeSpecName: "kube-api-access-lhxzr") pod "52134bb8-4757-4487-a5b9-38bdaea56506" (UID: "52134bb8-4757-4487-a5b9-38bdaea56506"). InnerVolumeSpecName "kube-api-access-lhxzr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 16:48:06.895477 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:06.895434 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52134bb8-4757-4487-a5b9-38bdaea56506-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "52134bb8-4757-4487-a5b9-38bdaea56506" (UID: "52134bb8-4757-4487-a5b9-38bdaea56506"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:48:06.895520 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:06.895477 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52134bb8-4757-4487-a5b9-38bdaea56506-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "52134bb8-4757-4487-a5b9-38bdaea56506" (UID: "52134bb8-4757-4487-a5b9-38bdaea56506"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 16:48:06.901291 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:06.901268 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52134bb8-4757-4487-a5b9-38bdaea56506-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "52134bb8-4757-4487-a5b9-38bdaea56506" (UID: "52134bb8-4757-4487-a5b9-38bdaea56506"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 18 16:48:06.994151 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:06.994112 2573 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/52134bb8-4757-4487-a5b9-38bdaea56506-ca-trust-extracted\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\""
Mar 18 16:48:06.994151 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:06.994143 2573 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/52134bb8-4757-4487-a5b9-38bdaea56506-registry-certificates\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\""
Mar 18 16:48:06.994151 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:06.994155 2573 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/52134bb8-4757-4487-a5b9-38bdaea56506-bound-sa-token\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\""
Mar 18 16:48:06.994381 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:06.994165 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lhxzr\" (UniqueName: \"kubernetes.io/projected/52134bb8-4757-4487-a5b9-38bdaea56506-kube-api-access-lhxzr\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\""
Mar 18 16:48:06.994381 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:06.994174 2573 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/52134bb8-4757-4487-a5b9-38bdaea56506-image-registry-private-configuration\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\""
Mar 18 16:48:06.994381 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:06.994184 2573 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/52134bb8-4757-4487-a5b9-38bdaea56506-registry-tls\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\""
Mar 18 16:48:06.994381 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:06.994193 2573 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/52134bb8-4757-4487-a5b9-38bdaea56506-installation-pull-secrets\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\""
Mar 18 16:48:07.496794 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:07.496757 2573 generic.go:358] "Generic (PLEG): container finished" podID="52134bb8-4757-4487-a5b9-38bdaea56506" containerID="71fcd78b817aeea780fa8cfdc83fd793b875bfc172f0e5a1f0e16eb16d34722a" exitCode=0
Mar 18 16:48:07.496996 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:07.496792 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7776559f4-s949p" event={"ID":"52134bb8-4757-4487-a5b9-38bdaea56506","Type":"ContainerDied","Data":"71fcd78b817aeea780fa8cfdc83fd793b875bfc172f0e5a1f0e16eb16d34722a"}
Mar 18 16:48:07.496996 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:07.496826 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7776559f4-s949p"
Mar 18 16:48:07.496996 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:07.496841 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7776559f4-s949p" event={"ID":"52134bb8-4757-4487-a5b9-38bdaea56506","Type":"ContainerDied","Data":"56d72895fdcbe8ed65d06aca8e8e035e4c9fa72d53de90bbc69d3f1a377c5539"}
Mar 18 16:48:07.496996 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:07.496861 2573 scope.go:117] "RemoveContainer" containerID="71fcd78b817aeea780fa8cfdc83fd793b875bfc172f0e5a1f0e16eb16d34722a"
Mar 18 16:48:07.506231 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:07.506214 2573 scope.go:117] "RemoveContainer" containerID="71fcd78b817aeea780fa8cfdc83fd793b875bfc172f0e5a1f0e16eb16d34722a"
Mar 18 16:48:07.506489 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:48:07.506469 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71fcd78b817aeea780fa8cfdc83fd793b875bfc172f0e5a1f0e16eb16d34722a\": container with ID starting with 71fcd78b817aeea780fa8cfdc83fd793b875bfc172f0e5a1f0e16eb16d34722a not found: ID does not exist" containerID="71fcd78b817aeea780fa8cfdc83fd793b875bfc172f0e5a1f0e16eb16d34722a"
Mar 18 16:48:07.506609 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:07.506503 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71fcd78b817aeea780fa8cfdc83fd793b875bfc172f0e5a1f0e16eb16d34722a"} err="failed to get container status \"71fcd78b817aeea780fa8cfdc83fd793b875bfc172f0e5a1f0e16eb16d34722a\": rpc error: code = NotFound desc = could not find container \"71fcd78b817aeea780fa8cfdc83fd793b875bfc172f0e5a1f0e16eb16d34722a\": container with ID starting with 71fcd78b817aeea780fa8cfdc83fd793b875bfc172f0e5a1f0e16eb16d34722a not found: ID does not exist"
Mar 18 16:48:07.521625 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:07.521595 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7776559f4-s949p"]
Mar 18 16:48:07.524967 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:07.524923 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7776559f4-s949p"]
Mar 18 16:48:08.502011 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:08.501977 2573 generic.go:358] "Generic (PLEG): container finished" podID="b5757757-50ec-49cc-93fb-20785bb506cb" containerID="08cd55cb452215c2d45af4ddfbb32764f16b631a40f01f332c8df13253f2ee9f" exitCode=0
Mar 18 16:48:08.502418 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:08.502025 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-l4zpq" event={"ID":"b5757757-50ec-49cc-93fb-20785bb506cb","Type":"ContainerDied","Data":"08cd55cb452215c2d45af4ddfbb32764f16b631a40f01f332c8df13253f2ee9f"}
Mar 18 16:48:08.502418 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:08.502352 2573 scope.go:117] "RemoveContainer" containerID="08cd55cb452215c2d45af4ddfbb32764f16b631a40f01f332c8df13253f2ee9f"
Mar 18 16:48:08.887425 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:08.887392 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52134bb8-4757-4487-a5b9-38bdaea56506" path="/var/lib/kubelet/pods/52134bb8-4757-4487-a5b9-38bdaea56506/volumes"
Mar 18 16:48:09.507762 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:09.507726 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-l4zpq" event={"ID":"b5757757-50ec-49cc-93fb-20785bb506cb","Type":"ContainerStarted","Data":"975e6dc90987a473521cf12481ace6f0f2c2bfd43b52beacdb75d351e8d6ccd1"}
Mar 18 16:48:18.532930 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:18.532891 2573 generic.go:358] "Generic (PLEG): container finished" podID="847a4bfe-1ce5-4bcc-bd8c-f62cb630b993" containerID="e66d20c2d00d2536a7379774712fcdeb3681efbc043c7e1ba3a50ef3c014195f" exitCode=0
Mar 18 16:48:18.532930 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:18.532933 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-xp98c" event={"ID":"847a4bfe-1ce5-4bcc-bd8c-f62cb630b993","Type":"ContainerDied","Data":"e66d20c2d00d2536a7379774712fcdeb3681efbc043c7e1ba3a50ef3c014195f"}
Mar 18 16:48:18.533404 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:18.533269 2573 scope.go:117] "RemoveContainer" containerID="e66d20c2d00d2536a7379774712fcdeb3681efbc043c7e1ba3a50ef3c014195f"
Mar 18 16:48:19.537503 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:19.537467 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-xp98c" event={"ID":"847a4bfe-1ce5-4bcc-bd8c-f62cb630b993","Type":"ContainerStarted","Data":"2b65207109fce96422104ad61ec0fa2b0a0b6ff0b53fe0051967f7806397475a"}
Mar 18 16:48:44.705023 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:44.704974 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52f2a3f3-56d7-41f7-8bed-9e7229d96408-metrics-certs\") pod \"network-metrics-daemon-wtbdl\" (UID: \"52f2a3f3-56d7-41f7-8bed-9e7229d96408\") " pod="openshift-multus/network-metrics-daemon-wtbdl"
Mar 18 16:48:44.707260 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:44.707238 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52f2a3f3-56d7-41f7-8bed-9e7229d96408-metrics-certs\") pod \"network-metrics-daemon-wtbdl\" (UID: \"52f2a3f3-56d7-41f7-8bed-9e7229d96408\") " pod="openshift-multus/network-metrics-daemon-wtbdl"
Mar 18 16:48:44.988368 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:44.988272 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-w8h72\""
Mar 18 16:48:44.995888 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:44.995862 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wtbdl"
Mar 18 16:48:45.118201 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:45.118169 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wtbdl"]
Mar 18 16:48:45.121333 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:48:45.121298 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52f2a3f3_56d7_41f7_8bed_9e7229d96408.slice/crio-2ba0c95d4955cbf87097c747c69228b86ac16db4840302574032425cdb0f83ca WatchSource:0}: Error finding container 2ba0c95d4955cbf87097c747c69228b86ac16db4840302574032425cdb0f83ca: Status 404 returned error can't find the container with id 2ba0c95d4955cbf87097c747c69228b86ac16db4840302574032425cdb0f83ca
Mar 18 16:48:45.607828 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:45.607787 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wtbdl" event={"ID":"52f2a3f3-56d7-41f7-8bed-9e7229d96408","Type":"ContainerStarted","Data":"2ba0c95d4955cbf87097c747c69228b86ac16db4840302574032425cdb0f83ca"}
Mar 18 16:48:46.612992 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:46.612875 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wtbdl" event={"ID":"52f2a3f3-56d7-41f7-8bed-9e7229d96408","Type":"ContainerStarted","Data":"b784d4ca9d1f97baf3c602b1048ce7d70bfc4f594d5c2eb0dbf162ac6d886406"}
Mar 18 16:48:46.612992 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:46.612914 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wtbdl" event={"ID":"52f2a3f3-56d7-41f7-8bed-9e7229d96408","Type":"ContainerStarted","Data":"c60f5d96a3fa395d274807acb3002cec4589fdc3bc6d74e8fa9eacc98322c435"}
Mar 18 16:48:46.640522 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:48:46.640467 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-wtbdl" podStartSLOduration=252.428949233 podStartE2EDuration="4m13.640448575s" podCreationTimestamp="2026-03-18 16:44:33 +0000 UTC" firstStartedPulling="2026-03-18 16:48:45.123309235 +0000 UTC m=+252.838732211" lastFinishedPulling="2026-03-18 16:48:46.334808572 +0000 UTC m=+254.050231553" observedRunningTime="2026-03-18 16:48:46.640172321 +0000 UTC m=+254.355595319" watchObservedRunningTime="2026-03-18 16:48:46.640448575 +0000 UTC m=+254.355871573"
Mar 18 16:49:32.758398 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:49:32.758368 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qzdnc_c6018dac-2f72-43d9-b554-dffe8cf976c4/ovn-acl-logging/0.log"
Mar 18 16:49:32.759405 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:49:32.759383 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qzdnc_c6018dac-2f72-43d9-b554-dffe8cf976c4/ovn-acl-logging/0.log"
Mar 18 16:49:32.763577 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:49:32.763553 2573 kubelet.go:1628] "Image garbage collection succeeded"
Mar 18 16:51:12.099124 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:12.099084 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9bdd46fb7-pwjkv"]
Mar 18 16:51:12.099549 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:12.099408 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="52134bb8-4757-4487-a5b9-38bdaea56506" containerName="registry"
Mar 18 16:51:12.099549 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:12.099420 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="52134bb8-4757-4487-a5b9-38bdaea56506" containerName="registry"
Mar 18 16:51:12.099549 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:12.099463 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="52134bb8-4757-4487-a5b9-38bdaea56506" containerName="registry"
Mar 18 16:51:12.102302 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:12.102285 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9bdd46fb7-pwjkv"
Mar 18 16:51:12.104675 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:12.104649 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Mar 18 16:51:12.104823 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:12.104686 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Mar 18 16:51:12.105980 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:12.105962 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Mar 18 16:51:12.106036 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:12.105966 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Mar 18 16:51:12.106036 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:12.106005 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-lfpfc\""
Mar 18 16:51:12.111157 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:12.111132 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9bdd46fb7-pwjkv"]
Mar 18 16:51:12.140815 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:12.140772 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-fb9569db9-fc9qp"]
Mar 18 16:51:12.144004 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:12.143979 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-fb9569db9-fc9qp"
Mar 18 16:51:12.146789 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:12.146765 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Mar 18 16:51:12.153766 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:12.153739 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-fb9569db9-fc9qp"]
Mar 18 16:51:12.198719 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:12.198680 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2b6c4a0a-92ca-4005-9d1c-9c856990fce1-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-9bdd46fb7-pwjkv\" (UID: \"2b6c4a0a-92ca-4005-9d1c-9c856990fce1\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9bdd46fb7-pwjkv"
Mar 18 16:51:12.198905 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:12.198733 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjbn4\" (UniqueName: \"kubernetes.io/projected/2b6c4a0a-92ca-4005-9d1c-9c856990fce1-kube-api-access-rjbn4\") pod \"managed-serviceaccount-addon-agent-9bdd46fb7-pwjkv\" (UID: \"2b6c4a0a-92ca-4005-9d1c-9c856990fce1\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9bdd46fb7-pwjkv"
Mar 18 16:51:12.198905 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:12.198766 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js7q4\" (UniqueName: \"kubernetes.io/projected/248b934e-a7ba-4aca-aa9e-58fd8fb76b02-kube-api-access-js7q4\") pod \"klusterlet-addon-workmgr-fb9569db9-fc9qp\" (UID: \"248b934e-a7ba-4aca-aa9e-58fd8fb76b02\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-fb9569db9-fc9qp"
Mar 18 16:51:12.198905 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:12.198784 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/248b934e-a7ba-4aca-aa9e-58fd8fb76b02-tmp\") pod \"klusterlet-addon-workmgr-fb9569db9-fc9qp\" (UID: \"248b934e-a7ba-4aca-aa9e-58fd8fb76b02\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-fb9569db9-fc9qp"
Mar 18 16:51:12.198905 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:12.198829 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/248b934e-a7ba-4aca-aa9e-58fd8fb76b02-klusterlet-config\") pod \"klusterlet-addon-workmgr-fb9569db9-fc9qp\" (UID: \"248b934e-a7ba-4aca-aa9e-58fd8fb76b02\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-fb9569db9-fc9qp"
Mar 18 16:51:12.300164 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:12.300129 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rjbn4\" (UniqueName: \"kubernetes.io/projected/2b6c4a0a-92ca-4005-9d1c-9c856990fce1-kube-api-access-rjbn4\") pod \"managed-serviceaccount-addon-agent-9bdd46fb7-pwjkv\" (UID: \"2b6c4a0a-92ca-4005-9d1c-9c856990fce1\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9bdd46fb7-pwjkv"
Mar 18 16:51:12.300292 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:12.300171 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-js7q4\" (UniqueName: \"kubernetes.io/projected/248b934e-a7ba-4aca-aa9e-58fd8fb76b02-kube-api-access-js7q4\") pod \"klusterlet-addon-workmgr-fb9569db9-fc9qp\" (UID: \"248b934e-a7ba-4aca-aa9e-58fd8fb76b02\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-fb9569db9-fc9qp"
Mar 18 16:51:12.300292 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:12.300191 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/248b934e-a7ba-4aca-aa9e-58fd8fb76b02-tmp\") pod \"klusterlet-addon-workmgr-fb9569db9-fc9qp\" (UID: \"248b934e-a7ba-4aca-aa9e-58fd8fb76b02\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-fb9569db9-fc9qp"
Mar 18 16:51:12.300292 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:12.300225 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/248b934e-a7ba-4aca-aa9e-58fd8fb76b02-klusterlet-config\") pod \"klusterlet-addon-workmgr-fb9569db9-fc9qp\" (UID: \"248b934e-a7ba-4aca-aa9e-58fd8fb76b02\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-fb9569db9-fc9qp"
Mar 18 16:51:12.300292 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:12.300264 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2b6c4a0a-92ca-4005-9d1c-9c856990fce1-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-9bdd46fb7-pwjkv\" (UID: \"2b6c4a0a-92ca-4005-9d1c-9c856990fce1\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9bdd46fb7-pwjkv"
Mar 18 16:51:12.300709 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:12.300686 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/248b934e-a7ba-4aca-aa9e-58fd8fb76b02-tmp\") pod \"klusterlet-addon-workmgr-fb9569db9-fc9qp\" (UID: \"248b934e-a7ba-4aca-aa9e-58fd8fb76b02\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-fb9569db9-fc9qp"
Mar 18 16:51:12.302798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:12.302769 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/248b934e-a7ba-4aca-aa9e-58fd8fb76b02-klusterlet-config\") pod \"klusterlet-addon-workmgr-fb9569db9-fc9qp\" (UID: \"248b934e-a7ba-4aca-aa9e-58fd8fb76b02\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-fb9569db9-fc9qp"
Mar 18 16:51:12.302884 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:12.302872 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2b6c4a0a-92ca-4005-9d1c-9c856990fce1-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-9bdd46fb7-pwjkv\" (UID: \"2b6c4a0a-92ca-4005-9d1c-9c856990fce1\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9bdd46fb7-pwjkv"
Mar 18 16:51:12.308496 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:12.308471 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjbn4\" (UniqueName: \"kubernetes.io/projected/2b6c4a0a-92ca-4005-9d1c-9c856990fce1-kube-api-access-rjbn4\") pod \"managed-serviceaccount-addon-agent-9bdd46fb7-pwjkv\" (UID: \"2b6c4a0a-92ca-4005-9d1c-9c856990fce1\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9bdd46fb7-pwjkv"
Mar 18 16:51:12.308606 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:12.308531 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-js7q4\" (UniqueName: \"kubernetes.io/projected/248b934e-a7ba-4aca-aa9e-58fd8fb76b02-kube-api-access-js7q4\") pod \"klusterlet-addon-workmgr-fb9569db9-fc9qp\" (UID: \"248b934e-a7ba-4aca-aa9e-58fd8fb76b02\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-fb9569db9-fc9qp"
Mar 18 16:51:12.423346 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:12.423257 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9bdd46fb7-pwjkv"
Mar 18 16:51:12.454095 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:12.454058 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-fb9569db9-fc9qp"
Mar 18 16:51:12.556436 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:12.556396 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9bdd46fb7-pwjkv"]
Mar 18 16:51:12.560595 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:51:12.560564 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b6c4a0a_92ca_4005_9d1c_9c856990fce1.slice/crio-6f6b0f2932262a3851340c0c6f3f75bbc06d65bbccb42f60586f34850024833d WatchSource:0}: Error finding container 6f6b0f2932262a3851340c0c6f3f75bbc06d65bbccb42f60586f34850024833d: Status 404 returned error can't find the container with id 6f6b0f2932262a3851340c0c6f3f75bbc06d65bbccb42f60586f34850024833d
Mar 18 16:51:12.562528 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:12.562511 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 16:51:12.583924 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:12.583890 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-fb9569db9-fc9qp"]
Mar 18 16:51:12.586970 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:51:12.586919 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod248b934e_a7ba_4aca_aa9e_58fd8fb76b02.slice/crio-9c8d8729b8bb7b7efb776661b3f97eaf6b194745cf5ab25eb11b787562c00c1b WatchSource:0}: Error finding container 9c8d8729b8bb7b7efb776661b3f97eaf6b194745cf5ab25eb11b787562c00c1b: Status 404 returned error can't find the container with id 9c8d8729b8bb7b7efb776661b3f97eaf6b194745cf5ab25eb11b787562c00c1b
Mar 18 16:51:13.013735 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:13.013701 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9bdd46fb7-pwjkv" event={"ID":"2b6c4a0a-92ca-4005-9d1c-9c856990fce1","Type":"ContainerStarted","Data":"6f6b0f2932262a3851340c0c6f3f75bbc06d65bbccb42f60586f34850024833d"}
Mar 18 16:51:13.014634 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:13.014611 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-fb9569db9-fc9qp" event={"ID":"248b934e-a7ba-4aca-aa9e-58fd8fb76b02","Type":"ContainerStarted","Data":"9c8d8729b8bb7b7efb776661b3f97eaf6b194745cf5ab25eb11b787562c00c1b"}
Mar 18 16:51:16.026261 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:16.026219 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9bdd46fb7-pwjkv" event={"ID":"2b6c4a0a-92ca-4005-9d1c-9c856990fce1","Type":"ContainerStarted","Data":"31a66943561cffd1197a1b260ec1806ce052028afd324095f702d2cc07548d68"}
Mar 18 16:51:16.042362 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:16.042305 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9bdd46fb7-pwjkv" podStartSLOduration=1.516058949 podStartE2EDuration="4.04228913s" podCreationTimestamp="2026-03-18 16:51:12 +0000 UTC" firstStartedPulling="2026-03-18 16:51:12.56264299 +0000 UTC m=+400.278065967" lastFinishedPulling="2026-03-18 16:51:15.088873158 +0000 UTC m=+402.804296148" observedRunningTime="2026-03-18 16:51:16.042028349 +0000 UTC m=+403.757451347" watchObservedRunningTime="2026-03-18 16:51:16.04228913 +0000 UTC m=+403.757712127"
Mar 18 16:51:23.046061 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:23.046016 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-fb9569db9-fc9qp" event={"ID":"248b934e-a7ba-4aca-aa9e-58fd8fb76b02","Type":"ContainerStarted","Data":"64762315603c4117dd15337424cfc3abf39d2eb7c8f1d8cb4f4beb676d2ac08a"}
Mar 18 16:51:23.046542 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:23.046212 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-fb9569db9-fc9qp"
Mar 18 16:51:23.048038 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:23.048016 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-fb9569db9-fc9qp"
Mar 18 16:51:23.061874 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:23.061813 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-fb9569db9-fc9qp" podStartSLOduration=1.020121903 podStartE2EDuration="11.061795454s" podCreationTimestamp="2026-03-18 16:51:12 +0000 UTC" firstStartedPulling="2026-03-18 16:51:12.588762083 +0000 UTC m=+400.304185062" lastFinishedPulling="2026-03-18 16:51:22.630435632 +0000 UTC m=+410.345858613" observedRunningTime="2026-03-18 16:51:23.06167519 +0000 UTC m=+410.777098210" watchObservedRunningTime="2026-03-18 16:51:23.061795454 +0000 UTC m=+410.777218453"
Mar 18 16:51:24.820387 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:24.820350 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs7ts7"]
Mar 18 16:51:24.823757 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:24.823738 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs7ts7"
Mar 18 16:51:24.826295 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:24.826260 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Mar 18 16:51:24.826429 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:24.826335 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Mar 18 16:51:24.827487 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:24.827466 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-lptfs\""
Mar 18 16:51:24.831810 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:24.831788 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs7ts7"]
Mar 18 16:51:24.901490 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:24.901454 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/68df7e04-e7ea-450e-b785-6ca78096bce9-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs7ts7\" (UID: \"68df7e04-e7ea-450e-b785-6ca78096bce9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs7ts7"
Mar 18 16:51:24.901643 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:24.901505 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/68df7e04-e7ea-450e-b785-6ca78096bce9-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs7ts7\" (UID: \"68df7e04-e7ea-450e-b785-6ca78096bce9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs7ts7"
Mar 18 16:51:24.901643 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:24.901534 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs92z\" (UniqueName: \"kubernetes.io/projected/68df7e04-e7ea-450e-b785-6ca78096bce9-kube-api-access-qs92z\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs7ts7\" (UID: \"68df7e04-e7ea-450e-b785-6ca78096bce9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs7ts7"
Mar 18 16:51:25.002319 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:25.002280 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/68df7e04-e7ea-450e-b785-6ca78096bce9-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs7ts7\" (UID: \"68df7e04-e7ea-450e-b785-6ca78096bce9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs7ts7"
Mar 18 16:51:25.002510 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:25.002339 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/68df7e04-e7ea-450e-b785-6ca78096bce9-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs7ts7\" (UID: \"68df7e04-e7ea-450e-b785-6ca78096bce9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs7ts7"
Mar 18 16:51:25.002510 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:25.002376 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qs92z\" (UniqueName: \"kubernetes.io/projected/68df7e04-e7ea-450e-b785-6ca78096bce9-kube-api-access-qs92z\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs7ts7\" (UID: \"68df7e04-e7ea-450e-b785-6ca78096bce9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs7ts7"
Mar 18 16:51:25.002657 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:25.002636 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/68df7e04-e7ea-450e-b785-6ca78096bce9-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs7ts7\" (UID: \"68df7e04-e7ea-450e-b785-6ca78096bce9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs7ts7"
Mar 18 16:51:25.002722 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:25.002687 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/68df7e04-e7ea-450e-b785-6ca78096bce9-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs7ts7\" (UID: \"68df7e04-e7ea-450e-b785-6ca78096bce9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs7ts7"
Mar 18 16:51:25.012458 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:25.012420 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs92z\" (UniqueName: \"kubernetes.io/projected/68df7e04-e7ea-450e-b785-6ca78096bce9-kube-api-access-qs92z\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs7ts7\" (UID: \"68df7e04-e7ea-450e-b785-6ca78096bce9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs7ts7"
Mar 18 16:51:25.133758 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:25.133668 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs7ts7"
Mar 18 16:51:25.253589 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:25.253558 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs7ts7"]
Mar 18 16:51:25.256975 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:51:25.256930 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68df7e04_e7ea_450e_b785_6ca78096bce9.slice/crio-ff096e6f3a2c85f591bdbc9998e090d14119c4a95b4e55e18c33cebad1949218 WatchSource:0}: Error finding container ff096e6f3a2c85f591bdbc9998e090d14119c4a95b4e55e18c33cebad1949218: Status 404 returned error can't find the container with id ff096e6f3a2c85f591bdbc9998e090d14119c4a95b4e55e18c33cebad1949218
Mar 18 16:51:26.055050 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:26.055012 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs7ts7" event={"ID":"68df7e04-e7ea-450e-b785-6ca78096bce9","Type":"ContainerStarted","Data":"ff096e6f3a2c85f591bdbc9998e090d14119c4a95b4e55e18c33cebad1949218"}
Mar 18 16:51:33.076738 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:33.076700 2573 generic.go:358] "Generic (PLEG): container finished" podID="68df7e04-e7ea-450e-b785-6ca78096bce9" containerID="08e8a6e89b5e23aadd9e373a413505f46ef575a8f83b4b2b9aa6b53fe58a3b33" exitCode=0
Mar 18 16:51:33.077139 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:33.076780 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs7ts7" event={"ID":"68df7e04-e7ea-450e-b785-6ca78096bce9","Type":"ContainerDied","Data":"08e8a6e89b5e23aadd9e373a413505f46ef575a8f83b4b2b9aa6b53fe58a3b33"}
Mar 18 16:51:35.089291 ip-10-0-135-99 kubenswrapper[2573]: I0318
16:51:35.089203 2573 generic.go:358] "Generic (PLEG): container finished" podID="68df7e04-e7ea-450e-b785-6ca78096bce9" containerID="f82ac35e2bd35b3f8517fa35b1a86b2954ce6ea69f1a88a1a0d9be3ba1bf26f5" exitCode=0 Mar 18 16:51:35.089674 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:35.089288 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs7ts7" event={"ID":"68df7e04-e7ea-450e-b785-6ca78096bce9","Type":"ContainerDied","Data":"f82ac35e2bd35b3f8517fa35b1a86b2954ce6ea69f1a88a1a0d9be3ba1bf26f5"} Mar 18 16:51:42.111887 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:42.111850 2573 generic.go:358] "Generic (PLEG): container finished" podID="68df7e04-e7ea-450e-b785-6ca78096bce9" containerID="97cecbe7093830763f59ff986e1440ce73c5bea2f382c3b445e12e677178b490" exitCode=0 Mar 18 16:51:42.112448 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:42.111961 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs7ts7" event={"ID":"68df7e04-e7ea-450e-b785-6ca78096bce9","Type":"ContainerDied","Data":"97cecbe7093830763f59ff986e1440ce73c5bea2f382c3b445e12e677178b490"} Mar 18 16:51:43.235405 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:43.235379 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs7ts7" Mar 18 16:51:43.356347 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:43.356310 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/68df7e04-e7ea-450e-b785-6ca78096bce9-bundle\") pod \"68df7e04-e7ea-450e-b785-6ca78096bce9\" (UID: \"68df7e04-e7ea-450e-b785-6ca78096bce9\") " Mar 18 16:51:43.356527 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:43.356420 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs92z\" (UniqueName: \"kubernetes.io/projected/68df7e04-e7ea-450e-b785-6ca78096bce9-kube-api-access-qs92z\") pod \"68df7e04-e7ea-450e-b785-6ca78096bce9\" (UID: \"68df7e04-e7ea-450e-b785-6ca78096bce9\") " Mar 18 16:51:43.356527 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:43.356445 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/68df7e04-e7ea-450e-b785-6ca78096bce9-util\") pod \"68df7e04-e7ea-450e-b785-6ca78096bce9\" (UID: \"68df7e04-e7ea-450e-b785-6ca78096bce9\") " Mar 18 16:51:43.356980 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:43.356924 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68df7e04-e7ea-450e-b785-6ca78096bce9-bundle" (OuterVolumeSpecName: "bundle") pod "68df7e04-e7ea-450e-b785-6ca78096bce9" (UID: "68df7e04-e7ea-450e-b785-6ca78096bce9"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:51:43.358707 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:43.358672 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68df7e04-e7ea-450e-b785-6ca78096bce9-kube-api-access-qs92z" (OuterVolumeSpecName: "kube-api-access-qs92z") pod "68df7e04-e7ea-450e-b785-6ca78096bce9" (UID: "68df7e04-e7ea-450e-b785-6ca78096bce9"). InnerVolumeSpecName "kube-api-access-qs92z". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:51:43.361112 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:43.361091 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68df7e04-e7ea-450e-b785-6ca78096bce9-util" (OuterVolumeSpecName: "util") pod "68df7e04-e7ea-450e-b785-6ca78096bce9" (UID: "68df7e04-e7ea-450e-b785-6ca78096bce9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:51:43.457553 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:43.457515 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/68df7e04-e7ea-450e-b785-6ca78096bce9-util\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\"" Mar 18 16:51:43.457553 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:43.457545 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/68df7e04-e7ea-450e-b785-6ca78096bce9-bundle\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\"" Mar 18 16:51:43.457553 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:43.457555 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qs92z\" (UniqueName: \"kubernetes.io/projected/68df7e04-e7ea-450e-b785-6ca78096bce9-kube-api-access-qs92z\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\"" Mar 18 16:51:44.119518 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:44.119476 2573 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs7ts7" event={"ID":"68df7e04-e7ea-450e-b785-6ca78096bce9","Type":"ContainerDied","Data":"ff096e6f3a2c85f591bdbc9998e090d14119c4a95b4e55e18c33cebad1949218"} Mar 18 16:51:44.119518 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:44.119523 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff096e6f3a2c85f591bdbc9998e090d14119c4a95b4e55e18c33cebad1949218" Mar 18 16:51:44.119737 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:44.119494 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs7ts7" Mar 18 16:51:46.584655 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:46.584619 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-89k4h"] Mar 18 16:51:46.585078 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:46.584922 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="68df7e04-e7ea-450e-b785-6ca78096bce9" containerName="pull" Mar 18 16:51:46.585078 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:46.584933 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="68df7e04-e7ea-450e-b785-6ca78096bce9" containerName="pull" Mar 18 16:51:46.585078 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:46.584973 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="68df7e04-e7ea-450e-b785-6ca78096bce9" containerName="util" Mar 18 16:51:46.585078 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:46.584980 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="68df7e04-e7ea-450e-b785-6ca78096bce9" containerName="util" Mar 18 16:51:46.585078 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:46.584987 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="68df7e04-e7ea-450e-b785-6ca78096bce9" containerName="extract" Mar 18 16:51:46.585078 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:46.584993 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="68df7e04-e7ea-450e-b785-6ca78096bce9" containerName="extract" Mar 18 16:51:46.585078 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:46.585051 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="68df7e04-e7ea-450e-b785-6ca78096bce9" containerName="extract" Mar 18 16:51:46.649483 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:46.649444 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-89k4h"] Mar 18 16:51:46.649649 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:46.649580 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-89k4h" Mar 18 16:51:46.652549 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:46.652522 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Mar 18 16:51:46.652549 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:46.652546 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Mar 18 16:51:46.652728 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:46.652584 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Mar 18 16:51:46.652728 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:46.652616 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-m8r4l\"" Mar 18 16:51:46.780808 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:46.780768 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/secret/37bc003f-a9e0-428a-907a-b348fca87547-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-89k4h\" (UID: \"37bc003f-a9e0-428a-907a-b348fca87547\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-89k4h" Mar 18 16:51:46.780808 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:46.780809 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfk5t\" (UniqueName: \"kubernetes.io/projected/37bc003f-a9e0-428a-907a-b348fca87547-kube-api-access-kfk5t\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-89k4h\" (UID: \"37bc003f-a9e0-428a-907a-b348fca87547\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-89k4h" Mar 18 16:51:46.881518 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:46.881423 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/37bc003f-a9e0-428a-907a-b348fca87547-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-89k4h\" (UID: \"37bc003f-a9e0-428a-907a-b348fca87547\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-89k4h" Mar 18 16:51:46.881518 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:46.881466 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kfk5t\" (UniqueName: \"kubernetes.io/projected/37bc003f-a9e0-428a-907a-b348fca87547-kube-api-access-kfk5t\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-89k4h\" (UID: \"37bc003f-a9e0-428a-907a-b348fca87547\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-89k4h" Mar 18 16:51:46.883856 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:46.883826 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/37bc003f-a9e0-428a-907a-b348fca87547-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-89k4h\" (UID: 
\"37bc003f-a9e0-428a-907a-b348fca87547\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-89k4h" Mar 18 16:51:46.890617 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:46.890588 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfk5t\" (UniqueName: \"kubernetes.io/projected/37bc003f-a9e0-428a-907a-b348fca87547-kube-api-access-kfk5t\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-89k4h\" (UID: \"37bc003f-a9e0-428a-907a-b348fca87547\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-89k4h" Mar 18 16:51:46.960637 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:46.960590 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-89k4h" Mar 18 16:51:47.100068 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:47.100025 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-89k4h"] Mar 18 16:51:47.104703 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:51:47.104666 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37bc003f_a9e0_428a_907a_b348fca87547.slice/crio-3d3472bada614f539a861c1b2f85c606881d13cdaa70b4efce5ee5f00862254f WatchSource:0}: Error finding container 3d3472bada614f539a861c1b2f85c606881d13cdaa70b4efce5ee5f00862254f: Status 404 returned error can't find the container with id 3d3472bada614f539a861c1b2f85c606881d13cdaa70b4efce5ee5f00862254f Mar 18 16:51:47.129480 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:47.129439 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-89k4h" event={"ID":"37bc003f-a9e0-428a-907a-b348fca87547","Type":"ContainerStarted","Data":"3d3472bada614f539a861c1b2f85c606881d13cdaa70b4efce5ee5f00862254f"} Mar 18 16:51:51.142490 ip-10-0-135-99 kubenswrapper[2573]: I0318 
16:51:51.142397 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-89k4h" event={"ID":"37bc003f-a9e0-428a-907a-b348fca87547","Type":"ContainerStarted","Data":"58844af2e1938aecfd8279113e761adeb2b25ed0acd3b468188db8a37fb93514"} Mar 18 16:51:51.142863 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:51.142508 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-89k4h" Mar 18 16:51:51.164696 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:51.164638 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-89k4h" podStartSLOduration=1.535140498 podStartE2EDuration="5.164622459s" podCreationTimestamp="2026-03-18 16:51:46 +0000 UTC" firstStartedPulling="2026-03-18 16:51:47.106399196 +0000 UTC m=+434.821822172" lastFinishedPulling="2026-03-18 16:51:50.735881143 +0000 UTC m=+438.451304133" observedRunningTime="2026-03-18 16:51:51.163068351 +0000 UTC m=+438.878491348" watchObservedRunningTime="2026-03-18 16:51:51.164622459 +0000 UTC m=+438.880045456" Mar 18 16:51:51.281536 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:51.281496 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-9lknb"] Mar 18 16:51:51.284909 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:51.284888 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-9lknb" Mar 18 16:51:51.287416 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:51.287390 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Mar 18 16:51:51.287529 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:51.287390 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Mar 18 16:51:51.287529 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:51.287517 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-97v2s\"" Mar 18 16:51:51.294415 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:51.294389 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-9lknb"] Mar 18 16:51:51.416974 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:51.416863 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/10f9450a-6b8e-4203-9624-c73723b51f7c-cabundle0\") pod \"keda-operator-ffbb595cb-9lknb\" (UID: \"10f9450a-6b8e-4203-9624-c73723b51f7c\") " pod="openshift-keda/keda-operator-ffbb595cb-9lknb" Mar 18 16:51:51.416974 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:51.416915 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvn88\" (UniqueName: \"kubernetes.io/projected/10f9450a-6b8e-4203-9624-c73723b51f7c-kube-api-access-gvn88\") pod \"keda-operator-ffbb595cb-9lknb\" (UID: \"10f9450a-6b8e-4203-9624-c73723b51f7c\") " pod="openshift-keda/keda-operator-ffbb595cb-9lknb" Mar 18 16:51:51.417259 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:51.417090 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/projected/10f9450a-6b8e-4203-9624-c73723b51f7c-certificates\") pod \"keda-operator-ffbb595cb-9lknb\" (UID: \"10f9450a-6b8e-4203-9624-c73723b51f7c\") " pod="openshift-keda/keda-operator-ffbb595cb-9lknb" Mar 18 16:51:51.474396 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:51.474359 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-ntdt2"] Mar 18 16:51:51.477475 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:51.477454 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ntdt2" Mar 18 16:51:51.480071 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:51.480044 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Mar 18 16:51:51.492412 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:51.492382 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-ntdt2"] Mar 18 16:51:51.518044 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:51.518001 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/10f9450a-6b8e-4203-9624-c73723b51f7c-certificates\") pod \"keda-operator-ffbb595cb-9lknb\" (UID: \"10f9450a-6b8e-4203-9624-c73723b51f7c\") " pod="openshift-keda/keda-operator-ffbb595cb-9lknb" Mar 18 16:51:51.518229 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:51.518064 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/10f9450a-6b8e-4203-9624-c73723b51f7c-cabundle0\") pod \"keda-operator-ffbb595cb-9lknb\" (UID: \"10f9450a-6b8e-4203-9624-c73723b51f7c\") " pod="openshift-keda/keda-operator-ffbb595cb-9lknb" Mar 18 16:51:51.518229 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:51.518108 2573 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-gvn88\" (UniqueName: \"kubernetes.io/projected/10f9450a-6b8e-4203-9624-c73723b51f7c-kube-api-access-gvn88\") pod \"keda-operator-ffbb595cb-9lknb\" (UID: \"10f9450a-6b8e-4203-9624-c73723b51f7c\") " pod="openshift-keda/keda-operator-ffbb595cb-9lknb" Mar 18 16:51:51.518229 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:51:51.518175 2573 secret.go:281] references non-existent secret key: ca.crt Mar 18 16:51:51.518229 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:51:51.518198 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Mar 18 16:51:51.518229 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:51:51.518214 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-9lknb: references non-existent secret key: ca.crt Mar 18 16:51:51.518495 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:51:51.518290 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/10f9450a-6b8e-4203-9624-c73723b51f7c-certificates podName:10f9450a-6b8e-4203-9624-c73723b51f7c nodeName:}" failed. No retries permitted until 2026-03-18 16:51:52.018271169 +0000 UTC m=+439.733694150 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/10f9450a-6b8e-4203-9624-c73723b51f7c-certificates") pod "keda-operator-ffbb595cb-9lknb" (UID: "10f9450a-6b8e-4203-9624-c73723b51f7c") : references non-existent secret key: ca.crt Mar 18 16:51:51.518886 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:51.518862 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/10f9450a-6b8e-4203-9624-c73723b51f7c-cabundle0\") pod \"keda-operator-ffbb595cb-9lknb\" (UID: \"10f9450a-6b8e-4203-9624-c73723b51f7c\") " pod="openshift-keda/keda-operator-ffbb595cb-9lknb" Mar 18 16:51:51.530552 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:51.530519 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvn88\" (UniqueName: \"kubernetes.io/projected/10f9450a-6b8e-4203-9624-c73723b51f7c-kube-api-access-gvn88\") pod \"keda-operator-ffbb595cb-9lknb\" (UID: \"10f9450a-6b8e-4203-9624-c73723b51f7c\") " pod="openshift-keda/keda-operator-ffbb595cb-9lknb" Mar 18 16:51:51.619487 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:51.619438 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/18937e07-d56b-436e-aa18-5d812225e9aa-certificates\") pod \"keda-metrics-apiserver-7c9f485588-ntdt2\" (UID: \"18937e07-d56b-436e-aa18-5d812225e9aa\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ntdt2" Mar 18 16:51:51.619487 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:51.619477 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwlbm\" (UniqueName: \"kubernetes.io/projected/18937e07-d56b-436e-aa18-5d812225e9aa-kube-api-access-qwlbm\") pod \"keda-metrics-apiserver-7c9f485588-ntdt2\" (UID: \"18937e07-d56b-436e-aa18-5d812225e9aa\") " 
pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ntdt2" Mar 18 16:51:51.619750 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:51.619522 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/18937e07-d56b-436e-aa18-5d812225e9aa-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-ntdt2\" (UID: \"18937e07-d56b-436e-aa18-5d812225e9aa\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ntdt2" Mar 18 16:51:51.720519 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:51.720481 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/18937e07-d56b-436e-aa18-5d812225e9aa-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-ntdt2\" (UID: \"18937e07-d56b-436e-aa18-5d812225e9aa\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ntdt2" Mar 18 16:51:51.720694 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:51.720614 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/18937e07-d56b-436e-aa18-5d812225e9aa-certificates\") pod \"keda-metrics-apiserver-7c9f485588-ntdt2\" (UID: \"18937e07-d56b-436e-aa18-5d812225e9aa\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ntdt2" Mar 18 16:51:51.720694 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:51.720643 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qwlbm\" (UniqueName: \"kubernetes.io/projected/18937e07-d56b-436e-aa18-5d812225e9aa-kube-api-access-qwlbm\") pod \"keda-metrics-apiserver-7c9f485588-ntdt2\" (UID: \"18937e07-d56b-436e-aa18-5d812225e9aa\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ntdt2" Mar 18 16:51:51.720810 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:51.720733 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: 
\"kubernetes.io/empty-dir/18937e07-d56b-436e-aa18-5d812225e9aa-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-ntdt2\" (UID: \"18937e07-d56b-436e-aa18-5d812225e9aa\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ntdt2" Mar 18 16:51:51.723365 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:51.723335 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/18937e07-d56b-436e-aa18-5d812225e9aa-certificates\") pod \"keda-metrics-apiserver-7c9f485588-ntdt2\" (UID: \"18937e07-d56b-436e-aa18-5d812225e9aa\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ntdt2" Mar 18 16:51:51.731601 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:51.731565 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwlbm\" (UniqueName: \"kubernetes.io/projected/18937e07-d56b-436e-aa18-5d812225e9aa-kube-api-access-qwlbm\") pod \"keda-metrics-apiserver-7c9f485588-ntdt2\" (UID: \"18937e07-d56b-436e-aa18-5d812225e9aa\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ntdt2" Mar 18 16:51:51.788470 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:51.788433 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ntdt2" Mar 18 16:51:51.939650 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:51.939613 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-ntdt2"] Mar 18 16:51:51.942748 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:51:51.942719 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18937e07_d56b_436e_aa18_5d812225e9aa.slice/crio-a8d0b9bbbda63a86e36141277164285881a707f471f2c652446232fb764ed6c8 WatchSource:0}: Error finding container a8d0b9bbbda63a86e36141277164285881a707f471f2c652446232fb764ed6c8: Status 404 returned error can't find the container with id a8d0b9bbbda63a86e36141277164285881a707f471f2c652446232fb764ed6c8 Mar 18 16:51:52.023500 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:52.023405 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/10f9450a-6b8e-4203-9624-c73723b51f7c-certificates\") pod \"keda-operator-ffbb595cb-9lknb\" (UID: \"10f9450a-6b8e-4203-9624-c73723b51f7c\") " pod="openshift-keda/keda-operator-ffbb595cb-9lknb" Mar 18 16:51:52.025966 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:52.025929 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/10f9450a-6b8e-4203-9624-c73723b51f7c-certificates\") pod \"keda-operator-ffbb595cb-9lknb\" (UID: \"10f9450a-6b8e-4203-9624-c73723b51f7c\") " pod="openshift-keda/keda-operator-ffbb595cb-9lknb" Mar 18 16:51:52.147919 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:52.147877 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ntdt2" 
event={"ID":"18937e07-d56b-436e-aa18-5d812225e9aa","Type":"ContainerStarted","Data":"a8d0b9bbbda63a86e36141277164285881a707f471f2c652446232fb764ed6c8"} Mar 18 16:51:52.197524 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:52.197487 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-9lknb" Mar 18 16:51:52.333152 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:52.333127 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-9lknb"] Mar 18 16:51:52.335649 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:51:52.335617 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10f9450a_6b8e_4203_9624_c73723b51f7c.slice/crio-f6975855fd5ef3c832cbb7a18e06e64d141593d97db3b7009421eb854649266d WatchSource:0}: Error finding container f6975855fd5ef3c832cbb7a18e06e64d141593d97db3b7009421eb854649266d: Status 404 returned error can't find the container with id f6975855fd5ef3c832cbb7a18e06e64d141593d97db3b7009421eb854649266d Mar 18 16:51:53.152812 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:53.152766 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-9lknb" event={"ID":"10f9450a-6b8e-4203-9624-c73723b51f7c","Type":"ContainerStarted","Data":"f6975855fd5ef3c832cbb7a18e06e64d141593d97db3b7009421eb854649266d"} Mar 18 16:51:56.167216 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:56.167113 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ntdt2" event={"ID":"18937e07-d56b-436e-aa18-5d812225e9aa","Type":"ContainerStarted","Data":"229e62f2b9b1540edee836b0f2f5664aa9a72f83cb30edaa97d37c2c2a4d7859"} Mar 18 16:51:56.167216 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:56.167209 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ntdt2"
Mar 18 16:51:56.168636 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:56.168608 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-9lknb" event={"ID":"10f9450a-6b8e-4203-9624-c73723b51f7c","Type":"ContainerStarted","Data":"342eaeba4f2e049165ac93c6eb3e73b82f4e9d714ed05ec94de93c1497afba22"}
Mar 18 16:51:56.168806 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:56.168741 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-9lknb"
Mar 18 16:51:56.185085 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:56.185026 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ntdt2" podStartSLOduration=1.2768707959999999 podStartE2EDuration="5.184996674s" podCreationTimestamp="2026-03-18 16:51:51 +0000 UTC" firstStartedPulling="2026-03-18 16:51:51.944656703 +0000 UTC m=+439.660079694" lastFinishedPulling="2026-03-18 16:51:55.852782594 +0000 UTC m=+443.568205572" observedRunningTime="2026-03-18 16:51:56.184473397 +0000 UTC m=+443.899896396" watchObservedRunningTime="2026-03-18 16:51:56.184996674 +0000 UTC m=+443.900419672"
Mar 18 16:51:56.205891 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:51:56.205836 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-9lknb" podStartSLOduration=1.683870865 podStartE2EDuration="5.205822206s" podCreationTimestamp="2026-03-18 16:51:51 +0000 UTC" firstStartedPulling="2026-03-18 16:51:52.33714171 +0000 UTC m=+440.052564686" lastFinishedPulling="2026-03-18 16:51:55.859093044 +0000 UTC m=+443.574516027" observedRunningTime="2026-03-18 16:51:56.204355753 +0000 UTC m=+443.919778743" watchObservedRunningTime="2026-03-18 16:51:56.205822206 +0000 UTC m=+443.921245214"
Mar 18 16:52:07.175477 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:07.175434 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ntdt2"
Mar 18 16:52:12.149931 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:12.149896 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-89k4h"
Mar 18 16:52:17.173403 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:17.173369 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-9lknb"
Mar 18 16:52:44.826871 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:44.826829 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5wtmhr"]
Mar 18 16:52:44.838260 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:44.838230 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5wtmhr"
Mar 18 16:52:44.840326 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:44.840291 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5wtmhr"]
Mar 18 16:52:44.840716 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:44.840692 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Mar 18 16:52:44.840827 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:44.840713 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-lptfs\""
Mar 18 16:52:44.841833 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:44.841810 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Mar 18 16:52:44.960254 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:44.960209 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a52505a5-2f0c-4b60-ab48-1b449baf89fd-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5wtmhr\" (UID: \"a52505a5-2f0c-4b60-ab48-1b449baf89fd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5wtmhr"
Mar 18 16:52:44.960453 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:44.960265 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtcnj\" (UniqueName: \"kubernetes.io/projected/a52505a5-2f0c-4b60-ab48-1b449baf89fd-kube-api-access-rtcnj\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5wtmhr\" (UID: \"a52505a5-2f0c-4b60-ab48-1b449baf89fd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5wtmhr"
Mar 18 16:52:44.960453 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:44.960341 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a52505a5-2f0c-4b60-ab48-1b449baf89fd-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5wtmhr\" (UID: \"a52505a5-2f0c-4b60-ab48-1b449baf89fd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5wtmhr"
Mar 18 16:52:45.061172 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:45.061131 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a52505a5-2f0c-4b60-ab48-1b449baf89fd-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5wtmhr\" (UID: \"a52505a5-2f0c-4b60-ab48-1b449baf89fd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5wtmhr"
Mar 18 16:52:45.061334 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:45.061184 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rtcnj\" (UniqueName: \"kubernetes.io/projected/a52505a5-2f0c-4b60-ab48-1b449baf89fd-kube-api-access-rtcnj\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5wtmhr\" (UID: \"a52505a5-2f0c-4b60-ab48-1b449baf89fd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5wtmhr"
Mar 18 16:52:45.061334 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:45.061217 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a52505a5-2f0c-4b60-ab48-1b449baf89fd-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5wtmhr\" (UID: \"a52505a5-2f0c-4b60-ab48-1b449baf89fd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5wtmhr"
Mar 18 16:52:45.061598 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:45.061577 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a52505a5-2f0c-4b60-ab48-1b449baf89fd-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5wtmhr\" (UID: \"a52505a5-2f0c-4b60-ab48-1b449baf89fd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5wtmhr"
Mar 18 16:52:45.061640 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:45.061588 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a52505a5-2f0c-4b60-ab48-1b449baf89fd-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5wtmhr\" (UID: \"a52505a5-2f0c-4b60-ab48-1b449baf89fd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5wtmhr"
Mar 18 16:52:45.069641 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:45.069609 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtcnj\" (UniqueName: \"kubernetes.io/projected/a52505a5-2f0c-4b60-ab48-1b449baf89fd-kube-api-access-rtcnj\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5wtmhr\" (UID: \"a52505a5-2f0c-4b60-ab48-1b449baf89fd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5wtmhr"
Mar 18 16:52:45.148925 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:45.148834 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5wtmhr"
Mar 18 16:52:45.275015 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:45.274902 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5wtmhr"]
Mar 18 16:52:45.277794 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:52:45.277764 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda52505a5_2f0c_4b60_ab48_1b449baf89fd.slice/crio-4adc90b9a4ec954453b5e93a854fa59045c3544e5d88a507b4fabb9b2bbe25f8 WatchSource:0}: Error finding container 4adc90b9a4ec954453b5e93a854fa59045c3544e5d88a507b4fabb9b2bbe25f8: Status 404 returned error can't find the container with id 4adc90b9a4ec954453b5e93a854fa59045c3544e5d88a507b4fabb9b2bbe25f8
Mar 18 16:52:45.319240 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:45.319207 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5wtmhr" event={"ID":"a52505a5-2f0c-4b60-ab48-1b449baf89fd","Type":"ContainerStarted","Data":"4adc90b9a4ec954453b5e93a854fa59045c3544e5d88a507b4fabb9b2bbe25f8"}
Mar 18 16:52:46.323314 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:46.323283 2573 generic.go:358] "Generic (PLEG): container finished" podID="a52505a5-2f0c-4b60-ab48-1b449baf89fd" containerID="978c5a3624b3b67ca18764459a6e3f80757d910e315be2378b23694adcfcee38" exitCode=0
Mar 18 16:52:46.323695 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:46.323374 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5wtmhr" event={"ID":"a52505a5-2f0c-4b60-ab48-1b449baf89fd","Type":"ContainerDied","Data":"978c5a3624b3b67ca18764459a6e3f80757d910e315be2378b23694adcfcee38"}
Mar 18 16:52:48.331995 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:48.331930 2573 generic.go:358] "Generic (PLEG): container finished" podID="a52505a5-2f0c-4b60-ab48-1b449baf89fd" containerID="da7cd14e9f9be5174228503f493d80cccca70328572f775c84b2da6a53320a07" exitCode=0
Mar 18 16:52:48.331995 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:48.331981 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5wtmhr" event={"ID":"a52505a5-2f0c-4b60-ab48-1b449baf89fd","Type":"ContainerDied","Data":"da7cd14e9f9be5174228503f493d80cccca70328572f775c84b2da6a53320a07"}
Mar 18 16:52:49.337087 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:49.337052 2573 generic.go:358] "Generic (PLEG): container finished" podID="a52505a5-2f0c-4b60-ab48-1b449baf89fd" containerID="806684e679a847ef2fb851251a2f38846e2fdfcb8736538e7b1a176d1a7ad100" exitCode=0
Mar 18 16:52:49.337482 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:49.337142 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5wtmhr" event={"ID":"a52505a5-2f0c-4b60-ab48-1b449baf89fd","Type":"ContainerDied","Data":"806684e679a847ef2fb851251a2f38846e2fdfcb8736538e7b1a176d1a7ad100"}
Mar 18 16:52:50.460721 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:50.460696 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5wtmhr"
Mar 18 16:52:50.605669 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:50.605582 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a52505a5-2f0c-4b60-ab48-1b449baf89fd-bundle\") pod \"a52505a5-2f0c-4b60-ab48-1b449baf89fd\" (UID: \"a52505a5-2f0c-4b60-ab48-1b449baf89fd\") "
Mar 18 16:52:50.605669 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:50.605646 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a52505a5-2f0c-4b60-ab48-1b449baf89fd-util\") pod \"a52505a5-2f0c-4b60-ab48-1b449baf89fd\" (UID: \"a52505a5-2f0c-4b60-ab48-1b449baf89fd\") "
Mar 18 16:52:50.605903 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:50.605678 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtcnj\" (UniqueName: \"kubernetes.io/projected/a52505a5-2f0c-4b60-ab48-1b449baf89fd-kube-api-access-rtcnj\") pod \"a52505a5-2f0c-4b60-ab48-1b449baf89fd\" (UID: \"a52505a5-2f0c-4b60-ab48-1b449baf89fd\") "
Mar 18 16:52:50.606239 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:50.606215 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a52505a5-2f0c-4b60-ab48-1b449baf89fd-bundle" (OuterVolumeSpecName: "bundle") pod "a52505a5-2f0c-4b60-ab48-1b449baf89fd" (UID: "a52505a5-2f0c-4b60-ab48-1b449baf89fd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 18 16:52:50.607764 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:50.607736 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a52505a5-2f0c-4b60-ab48-1b449baf89fd-kube-api-access-rtcnj" (OuterVolumeSpecName: "kube-api-access-rtcnj") pod "a52505a5-2f0c-4b60-ab48-1b449baf89fd" (UID: "a52505a5-2f0c-4b60-ab48-1b449baf89fd"). InnerVolumeSpecName "kube-api-access-rtcnj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 16:52:50.611577 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:50.611537 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a52505a5-2f0c-4b60-ab48-1b449baf89fd-util" (OuterVolumeSpecName: "util") pod "a52505a5-2f0c-4b60-ab48-1b449baf89fd" (UID: "a52505a5-2f0c-4b60-ab48-1b449baf89fd"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 18 16:52:50.706955 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:50.706887 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a52505a5-2f0c-4b60-ab48-1b449baf89fd-util\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\""
Mar 18 16:52:50.706955 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:50.706933 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rtcnj\" (UniqueName: \"kubernetes.io/projected/a52505a5-2f0c-4b60-ab48-1b449baf89fd-kube-api-access-rtcnj\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\""
Mar 18 16:52:50.706955 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:50.706967 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a52505a5-2f0c-4b60-ab48-1b449baf89fd-bundle\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\""
Mar 18 16:52:51.344566 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:51.344533 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5wtmhr" event={"ID":"a52505a5-2f0c-4b60-ab48-1b449baf89fd","Type":"ContainerDied","Data":"4adc90b9a4ec954453b5e93a854fa59045c3544e5d88a507b4fabb9b2bbe25f8"}
Mar 18 16:52:51.344566 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:51.344567 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4adc90b9a4ec954453b5e93a854fa59045c3544e5d88a507b4fabb9b2bbe25f8"
Mar 18 16:52:51.344764 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:51.344594 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5wtmhr"
Mar 18 16:52:57.300145 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:57.300109 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-4djlj"]
Mar 18 16:52:57.300528 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:57.300425 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a52505a5-2f0c-4b60-ab48-1b449baf89fd" containerName="util"
Mar 18 16:52:57.300528 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:57.300436 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a52505a5-2f0c-4b60-ab48-1b449baf89fd" containerName="util"
Mar 18 16:52:57.300528 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:57.300447 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a52505a5-2f0c-4b60-ab48-1b449baf89fd" containerName="extract"
Mar 18 16:52:57.300528 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:57.300454 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a52505a5-2f0c-4b60-ab48-1b449baf89fd" containerName="extract"
Mar 18 16:52:57.300528 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:57.300462 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a52505a5-2f0c-4b60-ab48-1b449baf89fd" containerName="pull"
Mar 18 16:52:57.300528 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:57.300467 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a52505a5-2f0c-4b60-ab48-1b449baf89fd" containerName="pull"
Mar 18 16:52:57.300528 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:57.300524 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="a52505a5-2f0c-4b60-ab48-1b449baf89fd" containerName="extract"
Mar 18 16:52:57.303680 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:57.303663 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-4djlj"
Mar 18 16:52:57.306270 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:57.306248 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Mar 18 16:52:57.306432 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:57.306306 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Mar 18 16:52:57.306432 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:57.306332 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-kd86k\""
Mar 18 16:52:57.312735 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:57.312703 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-4djlj"]
Mar 18 16:52:57.459093 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:57.459046 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fe1378b9-c6d2-456f-9db4-9bd54901c12c-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-4djlj\" (UID: \"fe1378b9-c6d2-456f-9db4-9bd54901c12c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-4djlj"
Mar 18 16:52:57.459263 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:57.459099 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqrl7\" (UniqueName: \"kubernetes.io/projected/fe1378b9-c6d2-456f-9db4-9bd54901c12c-kube-api-access-qqrl7\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-4djlj\" (UID: \"fe1378b9-c6d2-456f-9db4-9bd54901c12c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-4djlj"
Mar 18 16:52:57.560495 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:57.560401 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fe1378b9-c6d2-456f-9db4-9bd54901c12c-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-4djlj\" (UID: \"fe1378b9-c6d2-456f-9db4-9bd54901c12c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-4djlj"
Mar 18 16:52:57.560495 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:57.560449 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qqrl7\" (UniqueName: \"kubernetes.io/projected/fe1378b9-c6d2-456f-9db4-9bd54901c12c-kube-api-access-qqrl7\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-4djlj\" (UID: \"fe1378b9-c6d2-456f-9db4-9bd54901c12c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-4djlj"
Mar 18 16:52:57.560825 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:57.560803 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fe1378b9-c6d2-456f-9db4-9bd54901c12c-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-4djlj\" (UID: \"fe1378b9-c6d2-456f-9db4-9bd54901c12c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-4djlj"
Mar 18 16:52:57.568959 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:57.568918 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqrl7\" (UniqueName: \"kubernetes.io/projected/fe1378b9-c6d2-456f-9db4-9bd54901c12c-kube-api-access-qqrl7\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-4djlj\" (UID: \"fe1378b9-c6d2-456f-9db4-9bd54901c12c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-4djlj"
Mar 18 16:52:57.613896 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:57.613858 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-4djlj"
Mar 18 16:52:57.756576 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:57.756550 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-4djlj"]
Mar 18 16:52:57.759652 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:52:57.759626 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe1378b9_c6d2_456f_9db4_9bd54901c12c.slice/crio-eeed11074c179b325c01a616df1d48a759866f4c486ac3e110d5caffe1b4ceb2 WatchSource:0}: Error finding container eeed11074c179b325c01a616df1d48a759866f4c486ac3e110d5caffe1b4ceb2: Status 404 returned error can't find the container with id eeed11074c179b325c01a616df1d48a759866f4c486ac3e110d5caffe1b4ceb2
Mar 18 16:52:58.365924 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:58.365881 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-4djlj" event={"ID":"fe1378b9-c6d2-456f-9db4-9bd54901c12c","Type":"ContainerStarted","Data":"eeed11074c179b325c01a616df1d48a759866f4c486ac3e110d5caffe1b4ceb2"}
Mar 18 16:52:59.174366 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:59.174329 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftztcg"]
Mar 18 16:52:59.177733 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:59.177710 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftztcg"
Mar 18 16:52:59.180455 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:59.180431 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Mar 18 16:52:59.181414 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:59.181387 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-lptfs\""
Mar 18 16:52:59.181525 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:59.181403 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Mar 18 16:52:59.187215 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:59.187024 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftztcg"]
Mar 18 16:52:59.272308 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:59.272273 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/147c9854-cf8e-485f-96f7-31e50cdd7c34-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftztcg\" (UID: \"147c9854-cf8e-485f-96f7-31e50cdd7c34\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftztcg"
Mar 18 16:52:59.272505 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:59.272337 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/147c9854-cf8e-485f-96f7-31e50cdd7c34-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftztcg\" (UID: \"147c9854-cf8e-485f-96f7-31e50cdd7c34\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftztcg"
Mar 18 16:52:59.272505 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:59.272430 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29tk9\" (UniqueName: \"kubernetes.io/projected/147c9854-cf8e-485f-96f7-31e50cdd7c34-kube-api-access-29tk9\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftztcg\" (UID: \"147c9854-cf8e-485f-96f7-31e50cdd7c34\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftztcg"
Mar 18 16:52:59.372825 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:59.372791 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/147c9854-cf8e-485f-96f7-31e50cdd7c34-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftztcg\" (UID: \"147c9854-cf8e-485f-96f7-31e50cdd7c34\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftztcg"
Mar 18 16:52:59.373288 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:59.372865 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/147c9854-cf8e-485f-96f7-31e50cdd7c34-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftztcg\" (UID: \"147c9854-cf8e-485f-96f7-31e50cdd7c34\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftztcg"
Mar 18 16:52:59.373288 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:59.372911 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-29tk9\" (UniqueName: \"kubernetes.io/projected/147c9854-cf8e-485f-96f7-31e50cdd7c34-kube-api-access-29tk9\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftztcg\" (UID: \"147c9854-cf8e-485f-96f7-31e50cdd7c34\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftztcg"
Mar 18 16:52:59.373288 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:59.373182 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/147c9854-cf8e-485f-96f7-31e50cdd7c34-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftztcg\" (UID: \"147c9854-cf8e-485f-96f7-31e50cdd7c34\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftztcg"
Mar 18 16:52:59.373288 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:59.373211 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/147c9854-cf8e-485f-96f7-31e50cdd7c34-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftztcg\" (UID: \"147c9854-cf8e-485f-96f7-31e50cdd7c34\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftztcg"
Mar 18 16:52:59.382420 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:59.382378 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-29tk9\" (UniqueName: \"kubernetes.io/projected/147c9854-cf8e-485f-96f7-31e50cdd7c34-kube-api-access-29tk9\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftztcg\" (UID: \"147c9854-cf8e-485f-96f7-31e50cdd7c34\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftztcg"
Mar 18 16:52:59.490078 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:59.490039 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftztcg"
Mar 18 16:52:59.982654 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:52:59.982628 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftztcg"]
Mar 18 16:52:59.984898 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:52:59.984869 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod147c9854_cf8e_485f_96f7_31e50cdd7c34.slice/crio-7d6dee33e4091685d2b5095735886f3991c9b97f29303773cdf9f796f8e61f7a WatchSource:0}: Error finding container 7d6dee33e4091685d2b5095735886f3991c9b97f29303773cdf9f796f8e61f7a: Status 404 returned error can't find the container with id 7d6dee33e4091685d2b5095735886f3991c9b97f29303773cdf9f796f8e61f7a
Mar 18 16:53:00.374593 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:00.374541 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-4djlj" event={"ID":"fe1378b9-c6d2-456f-9db4-9bd54901c12c","Type":"ContainerStarted","Data":"b8abd3c324c6547f25c6def4779efb2aa190041d517607df058652380da6d748"}
Mar 18 16:53:00.376202 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:00.376179 2573 generic.go:358] "Generic (PLEG): container finished" podID="147c9854-cf8e-485f-96f7-31e50cdd7c34" containerID="eae25c2b547c4631a47531589dbbce1fef63f29f7060516d4ff904f741a7d737" exitCode=0
Mar 18 16:53:00.376325 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:00.376243 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftztcg" event={"ID":"147c9854-cf8e-485f-96f7-31e50cdd7c34","Type":"ContainerDied","Data":"eae25c2b547c4631a47531589dbbce1fef63f29f7060516d4ff904f741a7d737"}
Mar 18 16:53:00.376325 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:00.376261 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftztcg" event={"ID":"147c9854-cf8e-485f-96f7-31e50cdd7c34","Type":"ContainerStarted","Data":"7d6dee33e4091685d2b5095735886f3991c9b97f29303773cdf9f796f8e61f7a"}
Mar 18 16:53:00.397709 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:00.397649 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-4djlj" podStartSLOduration=1.240264684 podStartE2EDuration="3.39763366s" podCreationTimestamp="2026-03-18 16:52:57 +0000 UTC" firstStartedPulling="2026-03-18 16:52:57.762114629 +0000 UTC m=+505.477537604" lastFinishedPulling="2026-03-18 16:52:59.919483595 +0000 UTC m=+507.634906580" observedRunningTime="2026-03-18 16:53:00.396417778 +0000 UTC m=+508.111840776" watchObservedRunningTime="2026-03-18 16:53:00.39763366 +0000 UTC m=+508.113056658"
Mar 18 16:53:03.391512 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:03.391475 2573 generic.go:358] "Generic (PLEG): container finished" podID="147c9854-cf8e-485f-96f7-31e50cdd7c34" containerID="ed71e967169b57ac9b7358174ce58a967f53f16715610e58ca033235fd39c67f" exitCode=0
Mar 18 16:53:03.391907 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:03.391530 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftztcg" event={"ID":"147c9854-cf8e-485f-96f7-31e50cdd7c34","Type":"ContainerDied","Data":"ed71e967169b57ac9b7358174ce58a967f53f16715610e58ca033235fd39c67f"}
Mar 18 16:53:04.396710 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:04.396678 2573 generic.go:358] "Generic (PLEG): container finished" podID="147c9854-cf8e-485f-96f7-31e50cdd7c34" containerID="c01216c81e0457030de7f310b4af9c30b57eb9c121caef4bbc0b2ea25800465c" exitCode=0
Mar 18 16:53:04.397134 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:04.396771 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftztcg" event={"ID":"147c9854-cf8e-485f-96f7-31e50cdd7c34","Type":"ContainerDied","Data":"c01216c81e0457030de7f310b4af9c30b57eb9c121caef4bbc0b2ea25800465c"}
Mar 18 16:53:05.523345 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:05.523321 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftztcg"
Mar 18 16:53:05.625776 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:05.625699 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/147c9854-cf8e-485f-96f7-31e50cdd7c34-util\") pod \"147c9854-cf8e-485f-96f7-31e50cdd7c34\" (UID: \"147c9854-cf8e-485f-96f7-31e50cdd7c34\") "
Mar 18 16:53:05.625974 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:05.625820 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/147c9854-cf8e-485f-96f7-31e50cdd7c34-bundle\") pod \"147c9854-cf8e-485f-96f7-31e50cdd7c34\" (UID: \"147c9854-cf8e-485f-96f7-31e50cdd7c34\") "
Mar 18 16:53:05.625974 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:05.625907 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29tk9\" (UniqueName: \"kubernetes.io/projected/147c9854-cf8e-485f-96f7-31e50cdd7c34-kube-api-access-29tk9\") pod \"147c9854-cf8e-485f-96f7-31e50cdd7c34\" (UID: \"147c9854-cf8e-485f-96f7-31e50cdd7c34\") "
Mar 18 16:53:05.626259 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:05.626236 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/147c9854-cf8e-485f-96f7-31e50cdd7c34-bundle" (OuterVolumeSpecName: "bundle") pod "147c9854-cf8e-485f-96f7-31e50cdd7c34" (UID: "147c9854-cf8e-485f-96f7-31e50cdd7c34"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 18 16:53:05.628151 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:05.628122 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/147c9854-cf8e-485f-96f7-31e50cdd7c34-kube-api-access-29tk9" (OuterVolumeSpecName: "kube-api-access-29tk9") pod "147c9854-cf8e-485f-96f7-31e50cdd7c34" (UID: "147c9854-cf8e-485f-96f7-31e50cdd7c34"). InnerVolumeSpecName "kube-api-access-29tk9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 16:53:05.630544 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:05.630522 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/147c9854-cf8e-485f-96f7-31e50cdd7c34-util" (OuterVolumeSpecName: "util") pod "147c9854-cf8e-485f-96f7-31e50cdd7c34" (UID: "147c9854-cf8e-485f-96f7-31e50cdd7c34"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 18 16:53:05.726872 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:05.726830 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-29tk9\" (UniqueName: \"kubernetes.io/projected/147c9854-cf8e-485f-96f7-31e50cdd7c34-kube-api-access-29tk9\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\""
Mar 18 16:53:05.726872 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:05.726867 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/147c9854-cf8e-485f-96f7-31e50cdd7c34-util\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\""
Mar 18 16:53:05.726872 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:05.726878 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/147c9854-cf8e-485f-96f7-31e50cdd7c34-bundle\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\""
Mar 18 16:53:06.405415 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:06.405377 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftztcg" event={"ID":"147c9854-cf8e-485f-96f7-31e50cdd7c34","Type":"ContainerDied","Data":"7d6dee33e4091685d2b5095735886f3991c9b97f29303773cdf9f796f8e61f7a"}
Mar 18 16:53:06.405415 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:06.405411 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d6dee33e4091685d2b5095735886f3991c9b97f29303773cdf9f796f8e61f7a"
Mar 18 16:53:06.405415 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:06.405387 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftztcg"
Mar 18 16:53:09.628161 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:09.628117 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-fv5dz"]
Mar 18 16:53:09.628634 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:09.628572 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="147c9854-cf8e-485f-96f7-31e50cdd7c34" containerName="pull"
Mar 18 16:53:09.628634 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:09.628591 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="147c9854-cf8e-485f-96f7-31e50cdd7c34" containerName="pull"
Mar 18 16:53:09.628634 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:09.628606 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="147c9854-cf8e-485f-96f7-31e50cdd7c34" containerName="extract"
Mar 18 16:53:09.628634 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:09.628616 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="147c9854-cf8e-485f-96f7-31e50cdd7c34" containerName="extract"
Mar 18 16:53:09.628634 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:09.628627 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container"
podUID="147c9854-cf8e-485f-96f7-31e50cdd7c34" containerName="util" Mar 18 16:53:09.628634 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:09.628637 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="147c9854-cf8e-485f-96f7-31e50cdd7c34" containerName="util" Mar 18 16:53:09.628910 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:09.628717 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="147c9854-cf8e-485f-96f7-31e50cdd7c34" containerName="extract" Mar 18 16:53:09.631952 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:09.631913 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fv5dz" Mar 18 16:53:09.634606 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:09.634579 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Mar 18 16:53:09.634606 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:09.634593 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Mar 18 16:53:09.635619 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:09.635603 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-9g5l6\"" Mar 18 16:53:09.639951 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:09.639914 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-fv5dz"] Mar 18 16:53:09.657265 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:09.657231 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f8917ccf-5984-4b2b-880b-c489ba5944a6-tmp\") pod \"openshift-lws-operator-bfc7f696d-fv5dz\" (UID: \"f8917ccf-5984-4b2b-880b-c489ba5944a6\") " 
pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fv5dz" Mar 18 16:53:09.657422 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:09.657288 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb7ph\" (UniqueName: \"kubernetes.io/projected/f8917ccf-5984-4b2b-880b-c489ba5944a6-kube-api-access-wb7ph\") pod \"openshift-lws-operator-bfc7f696d-fv5dz\" (UID: \"f8917ccf-5984-4b2b-880b-c489ba5944a6\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fv5dz" Mar 18 16:53:09.757630 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:09.757587 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f8917ccf-5984-4b2b-880b-c489ba5944a6-tmp\") pod \"openshift-lws-operator-bfc7f696d-fv5dz\" (UID: \"f8917ccf-5984-4b2b-880b-c489ba5944a6\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fv5dz" Mar 18 16:53:09.757805 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:09.757643 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wb7ph\" (UniqueName: \"kubernetes.io/projected/f8917ccf-5984-4b2b-880b-c489ba5944a6-kube-api-access-wb7ph\") pod \"openshift-lws-operator-bfc7f696d-fv5dz\" (UID: \"f8917ccf-5984-4b2b-880b-c489ba5944a6\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fv5dz" Mar 18 16:53:09.758056 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:09.758031 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f8917ccf-5984-4b2b-880b-c489ba5944a6-tmp\") pod \"openshift-lws-operator-bfc7f696d-fv5dz\" (UID: \"f8917ccf-5984-4b2b-880b-c489ba5944a6\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fv5dz" Mar 18 16:53:09.765887 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:09.765860 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wb7ph\" (UniqueName: \"kubernetes.io/projected/f8917ccf-5984-4b2b-880b-c489ba5944a6-kube-api-access-wb7ph\") pod \"openshift-lws-operator-bfc7f696d-fv5dz\" (UID: \"f8917ccf-5984-4b2b-880b-c489ba5944a6\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fv5dz" Mar 18 16:53:09.941912 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:09.941874 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fv5dz" Mar 18 16:53:10.070129 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:10.070102 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-fv5dz"] Mar 18 16:53:10.072902 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:53:10.072871 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8917ccf_5984_4b2b_880b_c489ba5944a6.slice/crio-bc84364c7935e3c21531ea25233f87a46d10f10f001b0eb33950defa8ac7276b WatchSource:0}: Error finding container bc84364c7935e3c21531ea25233f87a46d10f10f001b0eb33950defa8ac7276b: Status 404 returned error can't find the container with id bc84364c7935e3c21531ea25233f87a46d10f10f001b0eb33950defa8ac7276b Mar 18 16:53:10.420094 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:10.419986 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fv5dz" event={"ID":"f8917ccf-5984-4b2b-880b-c489ba5944a6","Type":"ContainerStarted","Data":"bc84364c7935e3c21531ea25233f87a46d10f10f001b0eb33950defa8ac7276b"} Mar 18 16:53:12.428449 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:12.428414 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fv5dz" event={"ID":"f8917ccf-5984-4b2b-880b-c489ba5944a6","Type":"ContainerStarted","Data":"64cc0e2e56826d1c2bb6578c8cb6c39102c5aa9439af35414b5a65a1027bdf1b"} Mar 18 
16:53:12.452317 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:12.452262 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fv5dz" podStartSLOduration=1.164245902 podStartE2EDuration="3.45224677s" podCreationTimestamp="2026-03-18 16:53:09 +0000 UTC" firstStartedPulling="2026-03-18 16:53:10.074299637 +0000 UTC m=+517.789722616" lastFinishedPulling="2026-03-18 16:53:12.362300505 +0000 UTC m=+520.077723484" observedRunningTime="2026-03-18 16:53:12.45135321 +0000 UTC m=+520.166776226" watchObservedRunningTime="2026-03-18 16:53:12.45224677 +0000 UTC m=+520.167669767" Mar 18 16:53:18.591862 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:18.591819 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cx8p7"] Mar 18 16:53:18.595376 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:18.595345 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cx8p7" Mar 18 16:53:18.598039 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:18.598015 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Mar 18 16:53:18.598196 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:18.598019 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Mar 18 16:53:18.598971 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:18.598956 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-lptfs\"" Mar 18 16:53:18.604412 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:18.604385 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cx8p7"] Mar 18 16:53:18.629096 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:18.629058 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a2ce27e5-77d0-43e7-964f-eba2fddb79a4-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cx8p7\" (UID: \"a2ce27e5-77d0-43e7-964f-eba2fddb79a4\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cx8p7" Mar 18 16:53:18.629239 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:18.629149 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62rjv\" (UniqueName: \"kubernetes.io/projected/a2ce27e5-77d0-43e7-964f-eba2fddb79a4-kube-api-access-62rjv\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cx8p7\" (UID: \"a2ce27e5-77d0-43e7-964f-eba2fddb79a4\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cx8p7" 
Mar 18 16:53:18.629239 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:18.629175 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a2ce27e5-77d0-43e7-964f-eba2fddb79a4-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cx8p7\" (UID: \"a2ce27e5-77d0-43e7-964f-eba2fddb79a4\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cx8p7" Mar 18 16:53:18.730274 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:18.730235 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a2ce27e5-77d0-43e7-964f-eba2fddb79a4-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cx8p7\" (UID: \"a2ce27e5-77d0-43e7-964f-eba2fddb79a4\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cx8p7" Mar 18 16:53:18.730450 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:18.730303 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-62rjv\" (UniqueName: \"kubernetes.io/projected/a2ce27e5-77d0-43e7-964f-eba2fddb79a4-kube-api-access-62rjv\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cx8p7\" (UID: \"a2ce27e5-77d0-43e7-964f-eba2fddb79a4\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cx8p7" Mar 18 16:53:18.730450 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:18.730326 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a2ce27e5-77d0-43e7-964f-eba2fddb79a4-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cx8p7\" (UID: \"a2ce27e5-77d0-43e7-964f-eba2fddb79a4\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cx8p7" Mar 18 16:53:18.730637 ip-10-0-135-99 kubenswrapper[2573]: 
I0318 16:53:18.730619 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a2ce27e5-77d0-43e7-964f-eba2fddb79a4-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cx8p7\" (UID: \"a2ce27e5-77d0-43e7-964f-eba2fddb79a4\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cx8p7" Mar 18 16:53:18.730685 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:18.730663 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a2ce27e5-77d0-43e7-964f-eba2fddb79a4-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cx8p7\" (UID: \"a2ce27e5-77d0-43e7-964f-eba2fddb79a4\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cx8p7" Mar 18 16:53:18.739099 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:18.739068 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-62rjv\" (UniqueName: \"kubernetes.io/projected/a2ce27e5-77d0-43e7-964f-eba2fddb79a4-kube-api-access-62rjv\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cx8p7\" (UID: \"a2ce27e5-77d0-43e7-964f-eba2fddb79a4\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cx8p7" Mar 18 16:53:18.906162 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:18.906062 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cx8p7" Mar 18 16:53:19.038058 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:19.038031 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cx8p7"] Mar 18 16:53:19.040635 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:53:19.040594 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2ce27e5_77d0_43e7_964f_eba2fddb79a4.slice/crio-7ae4ad1ffdb4c0c40b2c67734e0df9533852476bd47ea86632044c8d794d444f WatchSource:0}: Error finding container 7ae4ad1ffdb4c0c40b2c67734e0df9533852476bd47ea86632044c8d794d444f: Status 404 returned error can't find the container with id 7ae4ad1ffdb4c0c40b2c67734e0df9533852476bd47ea86632044c8d794d444f Mar 18 16:53:19.452528 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:19.452488 2573 generic.go:358] "Generic (PLEG): container finished" podID="a2ce27e5-77d0-43e7-964f-eba2fddb79a4" containerID="1bc28c00097c123c0dd1b40d671569043c00758cc3ce6f86df98e95faf87c7ca" exitCode=0 Mar 18 16:53:19.452706 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:19.452575 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cx8p7" event={"ID":"a2ce27e5-77d0-43e7-964f-eba2fddb79a4","Type":"ContainerDied","Data":"1bc28c00097c123c0dd1b40d671569043c00758cc3ce6f86df98e95faf87c7ca"} Mar 18 16:53:19.452706 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:19.452615 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cx8p7" event={"ID":"a2ce27e5-77d0-43e7-964f-eba2fddb79a4","Type":"ContainerStarted","Data":"7ae4ad1ffdb4c0c40b2c67734e0df9533852476bd47ea86632044c8d794d444f"} Mar 18 16:53:20.458103 ip-10-0-135-99 kubenswrapper[2573]: I0318 
16:53:20.458071 2573 generic.go:358] "Generic (PLEG): container finished" podID="a2ce27e5-77d0-43e7-964f-eba2fddb79a4" containerID="dbb335a64c2603982d4a982fc7e91aedffb5e9c9a660968ba24c20fe6adcb8ab" exitCode=0 Mar 18 16:53:20.458506 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:20.458154 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cx8p7" event={"ID":"a2ce27e5-77d0-43e7-964f-eba2fddb79a4","Type":"ContainerDied","Data":"dbb335a64c2603982d4a982fc7e91aedffb5e9c9a660968ba24c20fe6adcb8ab"} Mar 18 16:53:21.463068 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:21.463030 2573 generic.go:358] "Generic (PLEG): container finished" podID="a2ce27e5-77d0-43e7-964f-eba2fddb79a4" containerID="308ebde92bc3faf98d683e16f03f55bcce06f33ff27711e97ff6ed81b8effcac" exitCode=0 Mar 18 16:53:21.463520 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:21.463074 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cx8p7" event={"ID":"a2ce27e5-77d0-43e7-964f-eba2fddb79a4","Type":"ContainerDied","Data":"308ebde92bc3faf98d683e16f03f55bcce06f33ff27711e97ff6ed81b8effcac"} Mar 18 16:53:22.593121 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:22.593093 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cx8p7" Mar 18 16:53:22.662872 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:22.662833 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a2ce27e5-77d0-43e7-964f-eba2fddb79a4-util\") pod \"a2ce27e5-77d0-43e7-964f-eba2fddb79a4\" (UID: \"a2ce27e5-77d0-43e7-964f-eba2fddb79a4\") " Mar 18 16:53:22.662872 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:22.662871 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62rjv\" (UniqueName: \"kubernetes.io/projected/a2ce27e5-77d0-43e7-964f-eba2fddb79a4-kube-api-access-62rjv\") pod \"a2ce27e5-77d0-43e7-964f-eba2fddb79a4\" (UID: \"a2ce27e5-77d0-43e7-964f-eba2fddb79a4\") " Mar 18 16:53:22.663156 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:22.662897 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a2ce27e5-77d0-43e7-964f-eba2fddb79a4-bundle\") pod \"a2ce27e5-77d0-43e7-964f-eba2fddb79a4\" (UID: \"a2ce27e5-77d0-43e7-964f-eba2fddb79a4\") " Mar 18 16:53:22.663718 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:22.663690 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2ce27e5-77d0-43e7-964f-eba2fddb79a4-bundle" (OuterVolumeSpecName: "bundle") pod "a2ce27e5-77d0-43e7-964f-eba2fddb79a4" (UID: "a2ce27e5-77d0-43e7-964f-eba2fddb79a4"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:53:22.664952 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:22.664917 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2ce27e5-77d0-43e7-964f-eba2fddb79a4-kube-api-access-62rjv" (OuterVolumeSpecName: "kube-api-access-62rjv") pod "a2ce27e5-77d0-43e7-964f-eba2fddb79a4" (UID: "a2ce27e5-77d0-43e7-964f-eba2fddb79a4"). InnerVolumeSpecName "kube-api-access-62rjv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:53:22.670136 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:22.670100 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2ce27e5-77d0-43e7-964f-eba2fddb79a4-util" (OuterVolumeSpecName: "util") pod "a2ce27e5-77d0-43e7-964f-eba2fddb79a4" (UID: "a2ce27e5-77d0-43e7-964f-eba2fddb79a4"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:53:22.764172 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:22.764080 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a2ce27e5-77d0-43e7-964f-eba2fddb79a4-util\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\"" Mar 18 16:53:22.764172 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:22.764115 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-62rjv\" (UniqueName: \"kubernetes.io/projected/a2ce27e5-77d0-43e7-964f-eba2fddb79a4-kube-api-access-62rjv\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\"" Mar 18 16:53:22.764172 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:22.764128 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a2ce27e5-77d0-43e7-964f-eba2fddb79a4-bundle\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\"" Mar 18 16:53:23.471661 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:23.471624 2573 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cx8p7" event={"ID":"a2ce27e5-77d0-43e7-964f-eba2fddb79a4","Type":"ContainerDied","Data":"7ae4ad1ffdb4c0c40b2c67734e0df9533852476bd47ea86632044c8d794d444f"} Mar 18 16:53:23.471661 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:23.471664 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ae4ad1ffdb4c0c40b2c67734e0df9533852476bd47ea86632044c8d794d444f" Mar 18 16:53:23.471866 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:23.471686 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cx8p7" Mar 18 16:53:28.523654 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:28.523615 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/314e757ec11e35dec7dd7fd130e91ee3ae1a5478d22aa334148a4db1deqckg6"] Mar 18 16:53:28.524059 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:28.523935 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a2ce27e5-77d0-43e7-964f-eba2fddb79a4" containerName="pull" Mar 18 16:53:28.524059 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:28.523981 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2ce27e5-77d0-43e7-964f-eba2fddb79a4" containerName="pull" Mar 18 16:53:28.524059 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:28.524004 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a2ce27e5-77d0-43e7-964f-eba2fddb79a4" containerName="extract" Mar 18 16:53:28.524059 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:28.524013 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2ce27e5-77d0-43e7-964f-eba2fddb79a4" containerName="extract" Mar 18 16:53:28.524059 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:28.524025 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="a2ce27e5-77d0-43e7-964f-eba2fddb79a4" containerName="util" Mar 18 16:53:28.524059 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:28.524030 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2ce27e5-77d0-43e7-964f-eba2fddb79a4" containerName="util" Mar 18 16:53:28.524241 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:28.524113 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="a2ce27e5-77d0-43e7-964f-eba2fddb79a4" containerName="extract" Mar 18 16:53:28.528389 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:28.528369 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/314e757ec11e35dec7dd7fd130e91ee3ae1a5478d22aa334148a4db1deqckg6" Mar 18 16:53:28.531634 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:28.531608 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-lptfs\"" Mar 18 16:53:28.532702 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:28.532683 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Mar 18 16:53:28.532767 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:28.532727 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Mar 18 16:53:28.543646 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:28.543613 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/314e757ec11e35dec7dd7fd130e91ee3ae1a5478d22aa334148a4db1deqckg6"] Mar 18 16:53:28.610650 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:28.610615 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e4de1a44-ae9d-44cb-a1dd-8a08834fbeda-bundle\") pod \"314e757ec11e35dec7dd7fd130e91ee3ae1a5478d22aa334148a4db1deqckg6\" (UID: \"e4de1a44-ae9d-44cb-a1dd-8a08834fbeda\") " 
pod="openshift-marketplace/314e757ec11e35dec7dd7fd130e91ee3ae1a5478d22aa334148a4db1deqckg6" Mar 18 16:53:28.610841 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:28.610682 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp76j\" (UniqueName: \"kubernetes.io/projected/e4de1a44-ae9d-44cb-a1dd-8a08834fbeda-kube-api-access-lp76j\") pod \"314e757ec11e35dec7dd7fd130e91ee3ae1a5478d22aa334148a4db1deqckg6\" (UID: \"e4de1a44-ae9d-44cb-a1dd-8a08834fbeda\") " pod="openshift-marketplace/314e757ec11e35dec7dd7fd130e91ee3ae1a5478d22aa334148a4db1deqckg6" Mar 18 16:53:28.610841 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:28.610731 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e4de1a44-ae9d-44cb-a1dd-8a08834fbeda-util\") pod \"314e757ec11e35dec7dd7fd130e91ee3ae1a5478d22aa334148a4db1deqckg6\" (UID: \"e4de1a44-ae9d-44cb-a1dd-8a08834fbeda\") " pod="openshift-marketplace/314e757ec11e35dec7dd7fd130e91ee3ae1a5478d22aa334148a4db1deqckg6" Mar 18 16:53:28.712161 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:28.712102 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lp76j\" (UniqueName: \"kubernetes.io/projected/e4de1a44-ae9d-44cb-a1dd-8a08834fbeda-kube-api-access-lp76j\") pod \"314e757ec11e35dec7dd7fd130e91ee3ae1a5478d22aa334148a4db1deqckg6\" (UID: \"e4de1a44-ae9d-44cb-a1dd-8a08834fbeda\") " pod="openshift-marketplace/314e757ec11e35dec7dd7fd130e91ee3ae1a5478d22aa334148a4db1deqckg6" Mar 18 16:53:28.712161 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:28.712165 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e4de1a44-ae9d-44cb-a1dd-8a08834fbeda-util\") pod \"314e757ec11e35dec7dd7fd130e91ee3ae1a5478d22aa334148a4db1deqckg6\" (UID: \"e4de1a44-ae9d-44cb-a1dd-8a08834fbeda\") " 
pod="openshift-marketplace/314e757ec11e35dec7dd7fd130e91ee3ae1a5478d22aa334148a4db1deqckg6"
Mar 18 16:53:28.712383 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:28.712207 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e4de1a44-ae9d-44cb-a1dd-8a08834fbeda-bundle\") pod \"314e757ec11e35dec7dd7fd130e91ee3ae1a5478d22aa334148a4db1deqckg6\" (UID: \"e4de1a44-ae9d-44cb-a1dd-8a08834fbeda\") " pod="openshift-marketplace/314e757ec11e35dec7dd7fd130e91ee3ae1a5478d22aa334148a4db1deqckg6"
Mar 18 16:53:28.712659 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:28.712633 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e4de1a44-ae9d-44cb-a1dd-8a08834fbeda-util\") pod \"314e757ec11e35dec7dd7fd130e91ee3ae1a5478d22aa334148a4db1deqckg6\" (UID: \"e4de1a44-ae9d-44cb-a1dd-8a08834fbeda\") " pod="openshift-marketplace/314e757ec11e35dec7dd7fd130e91ee3ae1a5478d22aa334148a4db1deqckg6"
Mar 18 16:53:28.712712 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:28.712696 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e4de1a44-ae9d-44cb-a1dd-8a08834fbeda-bundle\") pod \"314e757ec11e35dec7dd7fd130e91ee3ae1a5478d22aa334148a4db1deqckg6\" (UID: \"e4de1a44-ae9d-44cb-a1dd-8a08834fbeda\") " pod="openshift-marketplace/314e757ec11e35dec7dd7fd130e91ee3ae1a5478d22aa334148a4db1deqckg6"
Mar 18 16:53:28.730169 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:28.730123 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp76j\" (UniqueName: \"kubernetes.io/projected/e4de1a44-ae9d-44cb-a1dd-8a08834fbeda-kube-api-access-lp76j\") pod \"314e757ec11e35dec7dd7fd130e91ee3ae1a5478d22aa334148a4db1deqckg6\" (UID: \"e4de1a44-ae9d-44cb-a1dd-8a08834fbeda\") " pod="openshift-marketplace/314e757ec11e35dec7dd7fd130e91ee3ae1a5478d22aa334148a4db1deqckg6"
Mar 18 16:53:28.838022 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:28.837921 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/314e757ec11e35dec7dd7fd130e91ee3ae1a5478d22aa334148a4db1deqckg6"
Mar 18 16:53:28.982338 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:28.982239 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/314e757ec11e35dec7dd7fd130e91ee3ae1a5478d22aa334148a4db1deqckg6"]
Mar 18 16:53:28.984864 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:53:28.984829 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4de1a44_ae9d_44cb_a1dd_8a08834fbeda.slice/crio-5d879c06a49e8546cbc5d3db5fb16dc16c70315d0bd2bef7dd6be280e1ab7ec8 WatchSource:0}: Error finding container 5d879c06a49e8546cbc5d3db5fb16dc16c70315d0bd2bef7dd6be280e1ab7ec8: Status 404 returned error can't find the container with id 5d879c06a49e8546cbc5d3db5fb16dc16c70315d0bd2bef7dd6be280e1ab7ec8
Mar 18 16:53:29.493685 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:29.493650 2573 generic.go:358] "Generic (PLEG): container finished" podID="e4de1a44-ae9d-44cb-a1dd-8a08834fbeda" containerID="eeb30ad21ed83d7810f160d04ae72ef3049629fe9b887b63a14c587b6693c959" exitCode=0
Mar 18 16:53:29.493864 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:29.493730 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/314e757ec11e35dec7dd7fd130e91ee3ae1a5478d22aa334148a4db1deqckg6" event={"ID":"e4de1a44-ae9d-44cb-a1dd-8a08834fbeda","Type":"ContainerDied","Data":"eeb30ad21ed83d7810f160d04ae72ef3049629fe9b887b63a14c587b6693c959"}
Mar 18 16:53:29.493864 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:29.493767 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/314e757ec11e35dec7dd7fd130e91ee3ae1a5478d22aa334148a4db1deqckg6" event={"ID":"e4de1a44-ae9d-44cb-a1dd-8a08834fbeda","Type":"ContainerStarted","Data":"5d879c06a49e8546cbc5d3db5fb16dc16c70315d0bd2bef7dd6be280e1ab7ec8"}
Mar 18 16:53:29.866394 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:29.866297 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-gxns9"]
Mar 18 16:53:29.869693 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:29.869669 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gxns9"
Mar 18 16:53:29.872641 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:29.872614 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\""
Mar 18 16:53:29.872776 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:29.872646 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-n629t\""
Mar 18 16:53:29.872776 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:29.872748 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\""
Mar 18 16:53:29.888027 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:29.887993 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-gxns9"]
Mar 18 16:53:29.919839 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:29.919802 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5dnc\" (UniqueName: \"kubernetes.io/projected/85ab4b77-b9a9-4a66-bae6-1f90d1bde79e-kube-api-access-j5dnc\") pod \"servicemesh-operator3-55f49c5f94-gxns9\" (UID: \"85ab4b77-b9a9-4a66-bae6-1f90d1bde79e\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-gxns9"
Mar 18 16:53:29.920025 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:29.919862 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/85ab4b77-b9a9-4a66-bae6-1f90d1bde79e-operator-config\") pod \"servicemesh-operator3-55f49c5f94-gxns9\" (UID: \"85ab4b77-b9a9-4a66-bae6-1f90d1bde79e\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-gxns9"
Mar 18 16:53:30.021405 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:30.021367 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j5dnc\" (UniqueName: \"kubernetes.io/projected/85ab4b77-b9a9-4a66-bae6-1f90d1bde79e-kube-api-access-j5dnc\") pod \"servicemesh-operator3-55f49c5f94-gxns9\" (UID: \"85ab4b77-b9a9-4a66-bae6-1f90d1bde79e\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-gxns9"
Mar 18 16:53:30.021575 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:30.021416 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/85ab4b77-b9a9-4a66-bae6-1f90d1bde79e-operator-config\") pod \"servicemesh-operator3-55f49c5f94-gxns9\" (UID: \"85ab4b77-b9a9-4a66-bae6-1f90d1bde79e\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-gxns9"
Mar 18 16:53:30.023909 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:30.023877 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/85ab4b77-b9a9-4a66-bae6-1f90d1bde79e-operator-config\") pod \"servicemesh-operator3-55f49c5f94-gxns9\" (UID: \"85ab4b77-b9a9-4a66-bae6-1f90d1bde79e\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-gxns9"
Mar 18 16:53:30.029983 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:30.029958 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5dnc\" (UniqueName: \"kubernetes.io/projected/85ab4b77-b9a9-4a66-bae6-1f90d1bde79e-kube-api-access-j5dnc\") pod 
\"servicemesh-operator3-55f49c5f94-gxns9\" (UID: \"85ab4b77-b9a9-4a66-bae6-1f90d1bde79e\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-gxns9"
Mar 18 16:53:30.179855 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:30.179818 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gxns9"
Mar 18 16:53:30.318900 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:30.318869 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-gxns9"]
Mar 18 16:53:30.323377 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:53:30.323340 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85ab4b77_b9a9_4a66_bae6_1f90d1bde79e.slice/crio-4b60414350ca5d53082be2d02326960ab0760f15db0de20c58ec68ec8ae9e396 WatchSource:0}: Error finding container 4b60414350ca5d53082be2d02326960ab0760f15db0de20c58ec68ec8ae9e396: Status 404 returned error can't find the container with id 4b60414350ca5d53082be2d02326960ab0760f15db0de20c58ec68ec8ae9e396
Mar 18 16:53:30.498754 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:30.498661 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gxns9" event={"ID":"85ab4b77-b9a9-4a66-bae6-1f90d1bde79e","Type":"ContainerStarted","Data":"4b60414350ca5d53082be2d02326960ab0760f15db0de20c58ec68ec8ae9e396"}
Mar 18 16:53:31.504860 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:31.504813 2573 generic.go:358] "Generic (PLEG): container finished" podID="e4de1a44-ae9d-44cb-a1dd-8a08834fbeda" containerID="b51d7faa0dd5d544e187ddbf7ccb8d6ffa9857f94fd5926ce01d53c0bdb4ce98" exitCode=0
Mar 18 16:53:31.505359 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:31.504902 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/314e757ec11e35dec7dd7fd130e91ee3ae1a5478d22aa334148a4db1deqckg6" event={"ID":"e4de1a44-ae9d-44cb-a1dd-8a08834fbeda","Type":"ContainerDied","Data":"b51d7faa0dd5d544e187ddbf7ccb8d6ffa9857f94fd5926ce01d53c0bdb4ce98"}
Mar 18 16:53:32.513096 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:32.513051 2573 generic.go:358] "Generic (PLEG): container finished" podID="e4de1a44-ae9d-44cb-a1dd-8a08834fbeda" containerID="3645af02883346ab47256279f807534134924c74430cf878923eeffda81dbb9f" exitCode=0
Mar 18 16:53:32.513511 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:32.513130 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/314e757ec11e35dec7dd7fd130e91ee3ae1a5478d22aa334148a4db1deqckg6" event={"ID":"e4de1a44-ae9d-44cb-a1dd-8a08834fbeda","Type":"ContainerDied","Data":"3645af02883346ab47256279f807534134924c74430cf878923eeffda81dbb9f"}
Mar 18 16:53:33.518525 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:33.518490 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gxns9" event={"ID":"85ab4b77-b9a9-4a66-bae6-1f90d1bde79e","Type":"ContainerStarted","Data":"c377bd981b660d5442b0d18c8ad72b5c2561ff17e14ace1545768d79346f76fe"}
Mar 18 16:53:33.518983 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:33.518552 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gxns9"
Mar 18 16:53:33.543352 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:33.543279 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gxns9" podStartSLOduration=2.001879676 podStartE2EDuration="4.54325761s" podCreationTimestamp="2026-03-18 16:53:29 +0000 UTC" firstStartedPulling="2026-03-18 16:53:30.326042128 +0000 UTC m=+538.041465118" lastFinishedPulling="2026-03-18 16:53:32.867420061 +0000 UTC m=+540.582843052" observedRunningTime="2026-03-18 16:53:33.540181964 +0000 UTC m=+541.255604963" watchObservedRunningTime="2026-03-18 16:53:33.54325761 +0000 UTC m=+541.258680612"
Mar 18 16:53:33.656507 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:33.656485 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/314e757ec11e35dec7dd7fd130e91ee3ae1a5478d22aa334148a4db1deqckg6"
Mar 18 16:53:33.749235 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:33.749192 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e4de1a44-ae9d-44cb-a1dd-8a08834fbeda-util\") pod \"e4de1a44-ae9d-44cb-a1dd-8a08834fbeda\" (UID: \"e4de1a44-ae9d-44cb-a1dd-8a08834fbeda\") "
Mar 18 16:53:33.749445 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:33.749299 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lp76j\" (UniqueName: \"kubernetes.io/projected/e4de1a44-ae9d-44cb-a1dd-8a08834fbeda-kube-api-access-lp76j\") pod \"e4de1a44-ae9d-44cb-a1dd-8a08834fbeda\" (UID: \"e4de1a44-ae9d-44cb-a1dd-8a08834fbeda\") "
Mar 18 16:53:33.749445 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:33.749323 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e4de1a44-ae9d-44cb-a1dd-8a08834fbeda-bundle\") pod \"e4de1a44-ae9d-44cb-a1dd-8a08834fbeda\" (UID: \"e4de1a44-ae9d-44cb-a1dd-8a08834fbeda\") "
Mar 18 16:53:33.750251 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:33.750223 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4de1a44-ae9d-44cb-a1dd-8a08834fbeda-bundle" (OuterVolumeSpecName: "bundle") pod "e4de1a44-ae9d-44cb-a1dd-8a08834fbeda" (UID: "e4de1a44-ae9d-44cb-a1dd-8a08834fbeda"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 18 16:53:33.751512 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:33.751486 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4de1a44-ae9d-44cb-a1dd-8a08834fbeda-kube-api-access-lp76j" (OuterVolumeSpecName: "kube-api-access-lp76j") pod "e4de1a44-ae9d-44cb-a1dd-8a08834fbeda" (UID: "e4de1a44-ae9d-44cb-a1dd-8a08834fbeda"). InnerVolumeSpecName "kube-api-access-lp76j". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 16:53:33.757776 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:33.757717 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4de1a44-ae9d-44cb-a1dd-8a08834fbeda-util" (OuterVolumeSpecName: "util") pod "e4de1a44-ae9d-44cb-a1dd-8a08834fbeda" (UID: "e4de1a44-ae9d-44cb-a1dd-8a08834fbeda"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 18 16:53:33.850670 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:33.850578 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lp76j\" (UniqueName: \"kubernetes.io/projected/e4de1a44-ae9d-44cb-a1dd-8a08834fbeda-kube-api-access-lp76j\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\""
Mar 18 16:53:33.850670 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:33.850614 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e4de1a44-ae9d-44cb-a1dd-8a08834fbeda-bundle\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\""
Mar 18 16:53:33.850670 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:33.850625 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e4de1a44-ae9d-44cb-a1dd-8a08834fbeda-util\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\""
Mar 18 16:53:34.525487 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:34.525462 2573 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/314e757ec11e35dec7dd7fd130e91ee3ae1a5478d22aa334148a4db1deqckg6"
Mar 18 16:53:34.525487 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:34.525470 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/314e757ec11e35dec7dd7fd130e91ee3ae1a5478d22aa334148a4db1deqckg6" event={"ID":"e4de1a44-ae9d-44cb-a1dd-8a08834fbeda","Type":"ContainerDied","Data":"5d879c06a49e8546cbc5d3db5fb16dc16c70315d0bd2bef7dd6be280e1ab7ec8"}
Mar 18 16:53:34.526045 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:34.525499 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d879c06a49e8546cbc5d3db5fb16dc16c70315d0bd2bef7dd6be280e1ab7ec8"
Mar 18 16:53:39.872501 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:39.872467 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-tt2lv"]
Mar 18 16:53:39.872906 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:39.872779 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e4de1a44-ae9d-44cb-a1dd-8a08834fbeda" containerName="util"
Mar 18 16:53:39.872906 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:39.872790 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4de1a44-ae9d-44cb-a1dd-8a08834fbeda" containerName="util"
Mar 18 16:53:39.872906 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:39.872807 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e4de1a44-ae9d-44cb-a1dd-8a08834fbeda" containerName="extract"
Mar 18 16:53:39.872906 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:39.872813 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4de1a44-ae9d-44cb-a1dd-8a08834fbeda" containerName="extract"
Mar 18 16:53:39.872906 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:39.872823 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e4de1a44-ae9d-44cb-a1dd-8a08834fbeda" containerName="pull"
Mar 18 16:53:39.872906 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:39.872829 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4de1a44-ae9d-44cb-a1dd-8a08834fbeda" containerName="pull"
Mar 18 16:53:39.872906 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:39.872888 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="e4de1a44-ae9d-44cb-a1dd-8a08834fbeda" containerName="extract"
Mar 18 16:53:39.876996 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:39.876974 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-tt2lv"
Mar 18 16:53:39.880330 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:39.880306 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\""
Mar 18 16:53:39.880684 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:39.880668 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Mar 18 16:53:39.881037 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:39.881020 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Mar 18 16:53:39.881170 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:39.881147 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\""
Mar 18 16:53:39.881281 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:39.881238 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-5cm85\""
Mar 18 16:53:39.881792 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:39.881774 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Mar 18 16:53:39.882183 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:39.882167 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\""
Mar 18 16:53:39.892371 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:39.892346 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-tt2lv"]
Mar 18 16:53:40.000304 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:40.000260 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9c291e94-0e6d-40da-ab84-0be9016c5885-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-tt2lv\" (UID: \"9c291e94-0e6d-40da-ab84-0be9016c5885\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-tt2lv"
Mar 18 16:53:40.000491 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:40.000388 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/9c291e94-0e6d-40da-ab84-0be9016c5885-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-tt2lv\" (UID: \"9c291e94-0e6d-40da-ab84-0be9016c5885\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-tt2lv"
Mar 18 16:53:40.000491 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:40.000429 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/9c291e94-0e6d-40da-ab84-0be9016c5885-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-tt2lv\" (UID: \"9c291e94-0e6d-40da-ab84-0be9016c5885\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-tt2lv"
Mar 18 16:53:40.000491 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:40.000459 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcwcv\" (UniqueName: \"kubernetes.io/projected/9c291e94-0e6d-40da-ab84-0be9016c5885-kube-api-access-rcwcv\") pod \"istiod-openshift-gateway-7cd77c7ffd-tt2lv\" (UID: \"9c291e94-0e6d-40da-ab84-0be9016c5885\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-tt2lv"
Mar 18 16:53:40.000687 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:40.000665 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/9c291e94-0e6d-40da-ab84-0be9016c5885-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-tt2lv\" (UID: \"9c291e94-0e6d-40da-ab84-0be9016c5885\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-tt2lv"
Mar 18 16:53:40.000754 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:40.000736 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/9c291e94-0e6d-40da-ab84-0be9016c5885-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-tt2lv\" (UID: \"9c291e94-0e6d-40da-ab84-0be9016c5885\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-tt2lv"
Mar 18 16:53:40.000804 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:40.000789 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/9c291e94-0e6d-40da-ab84-0be9016c5885-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-tt2lv\" (UID: \"9c291e94-0e6d-40da-ab84-0be9016c5885\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-tt2lv"
Mar 18 16:53:40.102115 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:40.102064 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/9c291e94-0e6d-40da-ab84-0be9016c5885-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-tt2lv\" (UID: \"9c291e94-0e6d-40da-ab84-0be9016c5885\") " 
pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-tt2lv"
Mar 18 16:53:40.102318 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:40.102135 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/9c291e94-0e6d-40da-ab84-0be9016c5885-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-tt2lv\" (UID: \"9c291e94-0e6d-40da-ab84-0be9016c5885\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-tt2lv"
Mar 18 16:53:40.102318 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:40.102168 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9c291e94-0e6d-40da-ab84-0be9016c5885-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-tt2lv\" (UID: \"9c291e94-0e6d-40da-ab84-0be9016c5885\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-tt2lv"
Mar 18 16:53:40.102318 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:40.102222 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/9c291e94-0e6d-40da-ab84-0be9016c5885-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-tt2lv\" (UID: \"9c291e94-0e6d-40da-ab84-0be9016c5885\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-tt2lv"
Mar 18 16:53:40.102318 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:40.102274 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/9c291e94-0e6d-40da-ab84-0be9016c5885-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-tt2lv\" (UID: \"9c291e94-0e6d-40da-ab84-0be9016c5885\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-tt2lv"
Mar 18 16:53:40.102318 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:40.102303 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rcwcv\" (UniqueName: \"kubernetes.io/projected/9c291e94-0e6d-40da-ab84-0be9016c5885-kube-api-access-rcwcv\") pod \"istiod-openshift-gateway-7cd77c7ffd-tt2lv\" (UID: \"9c291e94-0e6d-40da-ab84-0be9016c5885\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-tt2lv"
Mar 18 16:53:40.102540 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:40.102359 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/9c291e94-0e6d-40da-ab84-0be9016c5885-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-tt2lv\" (UID: \"9c291e94-0e6d-40da-ab84-0be9016c5885\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-tt2lv"
Mar 18 16:53:40.103160 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:40.103129 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/9c291e94-0e6d-40da-ab84-0be9016c5885-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-tt2lv\" (UID: \"9c291e94-0e6d-40da-ab84-0be9016c5885\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-tt2lv"
Mar 18 16:53:40.105181 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:40.104995 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9c291e94-0e6d-40da-ab84-0be9016c5885-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-tt2lv\" (UID: \"9c291e94-0e6d-40da-ab84-0be9016c5885\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-tt2lv"
Mar 18 16:53:40.105288 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:40.105263 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/9c291e94-0e6d-40da-ab84-0be9016c5885-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-tt2lv\" (UID: \"9c291e94-0e6d-40da-ab84-0be9016c5885\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-tt2lv"
Mar 18 16:53:40.105356 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:40.105286 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/9c291e94-0e6d-40da-ab84-0be9016c5885-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-tt2lv\" (UID: \"9c291e94-0e6d-40da-ab84-0be9016c5885\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-tt2lv"
Mar 18 16:53:40.105739 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:40.105708 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/9c291e94-0e6d-40da-ab84-0be9016c5885-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-tt2lv\" (UID: \"9c291e94-0e6d-40da-ab84-0be9016c5885\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-tt2lv"
Mar 18 16:53:40.110929 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:40.110898 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/9c291e94-0e6d-40da-ab84-0be9016c5885-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-tt2lv\" (UID: \"9c291e94-0e6d-40da-ab84-0be9016c5885\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-tt2lv"
Mar 18 16:53:40.111133 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:40.111116 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcwcv\" (UniqueName: \"kubernetes.io/projected/9c291e94-0e6d-40da-ab84-0be9016c5885-kube-api-access-rcwcv\") pod \"istiod-openshift-gateway-7cd77c7ffd-tt2lv\" (UID: \"9c291e94-0e6d-40da-ab84-0be9016c5885\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-tt2lv"
Mar 18 16:53:40.188143 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:40.188107 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-tt2lv"
Mar 18 16:53:40.332233 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:40.332206 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-tt2lv"]
Mar 18 16:53:40.338502 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:53:40.338471 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c291e94_0e6d_40da_ab84_0be9016c5885.slice/crio-43115f1c85ad96576c500e01640034c4c9841ddf48ebb88e308cbe2a9ab8652d WatchSource:0}: Error finding container 43115f1c85ad96576c500e01640034c4c9841ddf48ebb88e308cbe2a9ab8652d: Status 404 returned error can't find the container with id 43115f1c85ad96576c500e01640034c4c9841ddf48ebb88e308cbe2a9ab8652d
Mar 18 16:53:40.548587 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:40.548499 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-tt2lv" event={"ID":"9c291e94-0e6d-40da-ab84-0be9016c5885","Type":"ContainerStarted","Data":"43115f1c85ad96576c500e01640034c4c9841ddf48ebb88e308cbe2a9ab8652d"}
Mar 18 16:53:42.760053 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:42.760015 2573 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Mar 18 16:53:42.760311 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:42.760085 2573 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Mar 18 16:53:43.566009 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:43.565968 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-tt2lv" 
event={"ID":"9c291e94-0e6d-40da-ab84-0be9016c5885","Type":"ContainerStarted","Data":"d86b5b32197310d664280a907614f3c4f25e520877da3175188b49ccabb47a6e"}
Mar 18 16:53:43.566194 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:43.566110 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-tt2lv"
Mar 18 16:53:43.567802 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:43.567775 2573 patch_prober.go:28] interesting pod/istiod-openshift-gateway-7cd77c7ffd-tt2lv container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=
Mar 18 16:53:43.567889 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:43.567818 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-tt2lv" podUID="9c291e94-0e6d-40da-ab84-0be9016c5885" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 18 16:53:43.589041 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:43.588976 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-tt2lv" podStartSLOduration=2.169725987 podStartE2EDuration="4.588955408s" podCreationTimestamp="2026-03-18 16:53:39 +0000 UTC" firstStartedPulling="2026-03-18 16:53:40.340533527 +0000 UTC m=+548.055956503" lastFinishedPulling="2026-03-18 16:53:42.759762947 +0000 UTC m=+550.475185924" observedRunningTime="2026-03-18 16:53:43.586979468 +0000 UTC m=+551.302402465" watchObservedRunningTime="2026-03-18 16:53:43.588955408 +0000 UTC m=+551.304378398"
Mar 18 16:53:44.527921 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:44.527889 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gxns9"
Mar 18 16:53:44.570609 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:44.570583 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-tt2lv"
Mar 18 16:53:55.324336 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.324294 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b533c66b650aa9f48b663f855ad059523d1aee816b6832a0a59c6c34chrnvp"]
Mar 18 16:53:55.328446 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.328426 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b533c66b650aa9f48b663f855ad059523d1aee816b6832a0a59c6c34chrnvp"
Mar 18 16:53:55.331500 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.331474 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Mar 18 16:53:55.331613 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.331542 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Mar 18 16:53:55.332713 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.332696 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-lptfs\""
Mar 18 16:53:55.339007 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.338977 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b533c66b650aa9f48b663f855ad059523d1aee816b6832a0a59c6c34chrnvp"]
Mar 18 16:53:55.426497 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.426455 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bdh99n"]
Mar 18 16:53:55.430109 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.430090 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bdh99n"
Mar 18 16:53:55.436959 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.436909 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd24z\" (UniqueName: \"kubernetes.io/projected/ab965fdc-f9c6-4e4f-982f-5b26385e1ed6-kube-api-access-qd24z\") pod \"5b533c66b650aa9f48b663f855ad059523d1aee816b6832a0a59c6c34chrnvp\" (UID: \"ab965fdc-f9c6-4e4f-982f-5b26385e1ed6\") " pod="openshift-marketplace/5b533c66b650aa9f48b663f855ad059523d1aee816b6832a0a59c6c34chrnvp"
Mar 18 16:53:55.437088 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.437067 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ab965fdc-f9c6-4e4f-982f-5b26385e1ed6-util\") pod \"5b533c66b650aa9f48b663f855ad059523d1aee816b6832a0a59c6c34chrnvp\" (UID: \"ab965fdc-f9c6-4e4f-982f-5b26385e1ed6\") " pod="openshift-marketplace/5b533c66b650aa9f48b663f855ad059523d1aee816b6832a0a59c6c34chrnvp"
Mar 18 16:53:55.437137 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.437105 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ab965fdc-f9c6-4e4f-982f-5b26385e1ed6-bundle\") pod \"5b533c66b650aa9f48b663f855ad059523d1aee816b6832a0a59c6c34chrnvp\" (UID: \"ab965fdc-f9c6-4e4f-982f-5b26385e1ed6\") " pod="openshift-marketplace/5b533c66b650aa9f48b663f855ad059523d1aee816b6832a0a59c6c34chrnvp"
Mar 18 16:53:55.439215 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.439191 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bdh99n"]
Mar 18 16:53:55.525257 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.525220 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c308flrd"]
Mar 18 16:53:55.529038 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.529012 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c308flrd"
Mar 18 16:53:55.537594 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.537568 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c308flrd"]
Mar 18 16:53:55.537925 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.537884 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8376f4ed-27af-4e64-b2c7-ef984398bdc1-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bdh99n\" (UID: \"8376f4ed-27af-4e64-b2c7-ef984398bdc1\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bdh99n"
Mar 18 16:53:55.538040 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.537952 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8376f4ed-27af-4e64-b2c7-ef984398bdc1-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bdh99n\" (UID: \"8376f4ed-27af-4e64-b2c7-ef984398bdc1\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bdh99n"
Mar 18 16:53:55.538040 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.538034 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ab965fdc-f9c6-4e4f-982f-5b26385e1ed6-util\") pod \"5b533c66b650aa9f48b663f855ad059523d1aee816b6832a0a59c6c34chrnvp\" (UID: \"ab965fdc-f9c6-4e4f-982f-5b26385e1ed6\") " pod="openshift-marketplace/5b533c66b650aa9f48b663f855ad059523d1aee816b6832a0a59c6c34chrnvp"
Mar 18 16:53:55.538161 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.538068 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ab965fdc-f9c6-4e4f-982f-5b26385e1ed6-bundle\") pod \"5b533c66b650aa9f48b663f855ad059523d1aee816b6832a0a59c6c34chrnvp\" (UID: \"ab965fdc-f9c6-4e4f-982f-5b26385e1ed6\") " pod="openshift-marketplace/5b533c66b650aa9f48b663f855ad059523d1aee816b6832a0a59c6c34chrnvp"
Mar 18 16:53:55.538161 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.538125 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qd24z\" (UniqueName: \"kubernetes.io/projected/ab965fdc-f9c6-4e4f-982f-5b26385e1ed6-kube-api-access-qd24z\") pod \"5b533c66b650aa9f48b663f855ad059523d1aee816b6832a0a59c6c34chrnvp\" (UID: \"ab965fdc-f9c6-4e4f-982f-5b26385e1ed6\") " pod="openshift-marketplace/5b533c66b650aa9f48b663f855ad059523d1aee816b6832a0a59c6c34chrnvp"
Mar 18 16:53:55.538259 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.538180 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8dcn\" (UniqueName: \"kubernetes.io/projected/8376f4ed-27af-4e64-b2c7-ef984398bdc1-kube-api-access-f8dcn\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bdh99n\" (UID: \"8376f4ed-27af-4e64-b2c7-ef984398bdc1\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bdh99n"
Mar 18 16:53:55.538481 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.538455 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ab965fdc-f9c6-4e4f-982f-5b26385e1ed6-util\") pod \"5b533c66b650aa9f48b663f855ad059523d1aee816b6832a0a59c6c34chrnvp\" (UID: \"ab965fdc-f9c6-4e4f-982f-5b26385e1ed6\") " 
pod="openshift-marketplace/5b533c66b650aa9f48b663f855ad059523d1aee816b6832a0a59c6c34chrnvp" Mar 18 16:53:55.538558 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.538499 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ab965fdc-f9c6-4e4f-982f-5b26385e1ed6-bundle\") pod \"5b533c66b650aa9f48b663f855ad059523d1aee816b6832a0a59c6c34chrnvp\" (UID: \"ab965fdc-f9c6-4e4f-982f-5b26385e1ed6\") " pod="openshift-marketplace/5b533c66b650aa9f48b663f855ad059523d1aee816b6832a0a59c6c34chrnvp" Mar 18 16:53:55.546007 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.545960 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd24z\" (UniqueName: \"kubernetes.io/projected/ab965fdc-f9c6-4e4f-982f-5b26385e1ed6-kube-api-access-qd24z\") pod \"5b533c66b650aa9f48b663f855ad059523d1aee816b6832a0a59c6c34chrnvp\" (UID: \"ab965fdc-f9c6-4e4f-982f-5b26385e1ed6\") " pod="openshift-marketplace/5b533c66b650aa9f48b663f855ad059523d1aee816b6832a0a59c6c34chrnvp" Mar 18 16:53:55.626027 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.625929 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec887rwhg"] Mar 18 16:53:55.629745 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.629723 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec887rwhg" Mar 18 16:53:55.637844 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.637818 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec887rwhg"] Mar 18 16:53:55.639213 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.639189 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b533c66b650aa9f48b663f855ad059523d1aee816b6832a0a59c6c34chrnvp" Mar 18 16:53:55.639444 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.639418 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e11dfcd9-ee4a-4e94-9beb-94895e113198-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c308flrd\" (UID: \"e11dfcd9-ee4a-4e94-9beb-94895e113198\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c308flrd" Mar 18 16:53:55.639543 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.639461 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xprbb\" (UniqueName: \"kubernetes.io/projected/e11dfcd9-ee4a-4e94-9beb-94895e113198-kube-api-access-xprbb\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c308flrd\" (UID: \"e11dfcd9-ee4a-4e94-9beb-94895e113198\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c308flrd" Mar 18 16:53:55.639543 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.639500 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8376f4ed-27af-4e64-b2c7-ef984398bdc1-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bdh99n\" (UID: \"8376f4ed-27af-4e64-b2c7-ef984398bdc1\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bdh99n" Mar 18 16:53:55.639667 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.639554 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8376f4ed-27af-4e64-b2c7-ef984398bdc1-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bdh99n\" (UID: \"8376f4ed-27af-4e64-b2c7-ef984398bdc1\") " 
pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bdh99n" Mar 18 16:53:55.639667 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.639614 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e11dfcd9-ee4a-4e94-9beb-94895e113198-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c308flrd\" (UID: \"e11dfcd9-ee4a-4e94-9beb-94895e113198\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c308flrd" Mar 18 16:53:55.639783 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.639733 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f8dcn\" (UniqueName: \"kubernetes.io/projected/8376f4ed-27af-4e64-b2c7-ef984398bdc1-kube-api-access-f8dcn\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bdh99n\" (UID: \"8376f4ed-27af-4e64-b2c7-ef984398bdc1\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bdh99n" Mar 18 16:53:55.639912 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.639885 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8376f4ed-27af-4e64-b2c7-ef984398bdc1-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bdh99n\" (UID: \"8376f4ed-27af-4e64-b2c7-ef984398bdc1\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bdh99n" Mar 18 16:53:55.640332 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.640148 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8376f4ed-27af-4e64-b2c7-ef984398bdc1-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bdh99n\" (UID: \"8376f4ed-27af-4e64-b2c7-ef984398bdc1\") " 
pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bdh99n" Mar 18 16:53:55.648035 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.648007 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8dcn\" (UniqueName: \"kubernetes.io/projected/8376f4ed-27af-4e64-b2c7-ef984398bdc1-kube-api-access-f8dcn\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bdh99n\" (UID: \"8376f4ed-27af-4e64-b2c7-ef984398bdc1\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bdh99n" Mar 18 16:53:55.740494 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.740460 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bdh99n" Mar 18 16:53:55.740689 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.740666 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e11dfcd9-ee4a-4e94-9beb-94895e113198-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c308flrd\" (UID: \"e11dfcd9-ee4a-4e94-9beb-94895e113198\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c308flrd" Mar 18 16:53:55.740746 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.740724 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3e2db8bd-766f-4781-a4bb-1c09e35b116c-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec887rwhg\" (UID: \"3e2db8bd-766f-4781-a4bb-1c09e35b116c\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec887rwhg" Mar 18 16:53:55.740812 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.740794 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/e11dfcd9-ee4a-4e94-9beb-94895e113198-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c308flrd\" (UID: \"e11dfcd9-ee4a-4e94-9beb-94895e113198\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c308flrd" Mar 18 16:53:55.740864 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.740829 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xprbb\" (UniqueName: \"kubernetes.io/projected/e11dfcd9-ee4a-4e94-9beb-94895e113198-kube-api-access-xprbb\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c308flrd\" (UID: \"e11dfcd9-ee4a-4e94-9beb-94895e113198\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c308flrd" Mar 18 16:53:55.740918 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.740880 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3e2db8bd-766f-4781-a4bb-1c09e35b116c-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec887rwhg\" (UID: \"3e2db8bd-766f-4781-a4bb-1c09e35b116c\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec887rwhg" Mar 18 16:53:55.740918 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.740908 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6tzv\" (UniqueName: \"kubernetes.io/projected/3e2db8bd-766f-4781-a4bb-1c09e35b116c-kube-api-access-l6tzv\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec887rwhg\" (UID: \"3e2db8bd-766f-4781-a4bb-1c09e35b116c\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec887rwhg" Mar 18 16:53:55.741190 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.741159 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/e11dfcd9-ee4a-4e94-9beb-94895e113198-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c308flrd\" (UID: \"e11dfcd9-ee4a-4e94-9beb-94895e113198\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c308flrd" Mar 18 16:53:55.741258 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.741187 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e11dfcd9-ee4a-4e94-9beb-94895e113198-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c308flrd\" (UID: \"e11dfcd9-ee4a-4e94-9beb-94895e113198\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c308flrd" Mar 18 16:53:55.749931 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.749902 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xprbb\" (UniqueName: \"kubernetes.io/projected/e11dfcd9-ee4a-4e94-9beb-94895e113198-kube-api-access-xprbb\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c308flrd\" (UID: \"e11dfcd9-ee4a-4e94-9beb-94895e113198\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c308flrd" Mar 18 16:53:55.768791 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.768764 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b533c66b650aa9f48b663f855ad059523d1aee816b6832a0a59c6c34chrnvp"] Mar 18 16:53:55.771020 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:53:55.770985 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab965fdc_f9c6_4e4f_982f_5b26385e1ed6.slice/crio-5f1c73fad23787208a811686b2c100833b2abbd176c293fcd85c8f5116fbef62 WatchSource:0}: Error finding container 5f1c73fad23787208a811686b2c100833b2abbd176c293fcd85c8f5116fbef62: Status 404 returned error can't find the container with id 
5f1c73fad23787208a811686b2c100833b2abbd176c293fcd85c8f5116fbef62 Mar 18 16:53:55.838994 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.838968 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c308flrd" Mar 18 16:53:55.843423 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.842308 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3e2db8bd-766f-4781-a4bb-1c09e35b116c-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec887rwhg\" (UID: \"3e2db8bd-766f-4781-a4bb-1c09e35b116c\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec887rwhg" Mar 18 16:53:55.843423 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.842356 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l6tzv\" (UniqueName: \"kubernetes.io/projected/3e2db8bd-766f-4781-a4bb-1c09e35b116c-kube-api-access-l6tzv\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec887rwhg\" (UID: \"3e2db8bd-766f-4781-a4bb-1c09e35b116c\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec887rwhg" Mar 18 16:53:55.843423 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.842415 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3e2db8bd-766f-4781-a4bb-1c09e35b116c-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec887rwhg\" (UID: \"3e2db8bd-766f-4781-a4bb-1c09e35b116c\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec887rwhg" Mar 18 16:53:55.843423 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.842829 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/3e2db8bd-766f-4781-a4bb-1c09e35b116c-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec887rwhg\" (UID: \"3e2db8bd-766f-4781-a4bb-1c09e35b116c\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec887rwhg" Mar 18 16:53:55.843423 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.843111 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3e2db8bd-766f-4781-a4bb-1c09e35b116c-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec887rwhg\" (UID: \"3e2db8bd-766f-4781-a4bb-1c09e35b116c\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec887rwhg" Mar 18 16:53:55.851930 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.851872 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6tzv\" (UniqueName: \"kubernetes.io/projected/3e2db8bd-766f-4781-a4bb-1c09e35b116c-kube-api-access-l6tzv\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec887rwhg\" (UID: \"3e2db8bd-766f-4781-a4bb-1c09e35b116c\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec887rwhg" Mar 18 16:53:55.882749 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.882721 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bdh99n"] Mar 18 16:53:55.885602 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:53:55.885573 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8376f4ed_27af_4e64_b2c7_ef984398bdc1.slice/crio-074c1761ce2bf2544569eb2f45255535681c7cac6d32afa2757b2237de9d7b5a WatchSource:0}: Error finding container 074c1761ce2bf2544569eb2f45255535681c7cac6d32afa2757b2237de9d7b5a: Status 404 returned error can't find the container with id 
074c1761ce2bf2544569eb2f45255535681c7cac6d32afa2757b2237de9d7b5a Mar 18 16:53:55.940362 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.940318 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec887rwhg" Mar 18 16:53:55.983481 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:55.983419 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c308flrd"] Mar 18 16:53:55.986825 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:53:55.986799 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode11dfcd9_ee4a_4e94_9beb_94895e113198.slice/crio-6ea3ed1743f12b187bacd506bbbb350029137f75883b257634580d33d8bde80e WatchSource:0}: Error finding container 6ea3ed1743f12b187bacd506bbbb350029137f75883b257634580d33d8bde80e: Status 404 returned error can't find the container with id 6ea3ed1743f12b187bacd506bbbb350029137f75883b257634580d33d8bde80e Mar 18 16:53:56.080392 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:56.080352 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec887rwhg"] Mar 18 16:53:56.089065 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:53:56.089031 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e2db8bd_766f_4781_a4bb_1c09e35b116c.slice/crio-675fb0f92eed84ae29c6eeffb56db8eb26c6c51b97790d0b2cd2c60037c05cc5 WatchSource:0}: Error finding container 675fb0f92eed84ae29c6eeffb56db8eb26c6c51b97790d0b2cd2c60037c05cc5: Status 404 returned error can't find the container with id 675fb0f92eed84ae29c6eeffb56db8eb26c6c51b97790d0b2cd2c60037c05cc5 Mar 18 16:53:56.619173 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:56.619080 2573 generic.go:358] "Generic 
(PLEG): container finished" podID="ab965fdc-f9c6-4e4f-982f-5b26385e1ed6" containerID="486bdcef94cc3c811588574693ecd8dbcf320b07700175c91ae4db1bb68c3522" exitCode=0 Mar 18 16:53:56.619565 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:56.619169 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b533c66b650aa9f48b663f855ad059523d1aee816b6832a0a59c6c34chrnvp" event={"ID":"ab965fdc-f9c6-4e4f-982f-5b26385e1ed6","Type":"ContainerDied","Data":"486bdcef94cc3c811588574693ecd8dbcf320b07700175c91ae4db1bb68c3522"} Mar 18 16:53:56.619565 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:56.619218 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b533c66b650aa9f48b663f855ad059523d1aee816b6832a0a59c6c34chrnvp" event={"ID":"ab965fdc-f9c6-4e4f-982f-5b26385e1ed6","Type":"ContainerStarted","Data":"5f1c73fad23787208a811686b2c100833b2abbd176c293fcd85c8f5116fbef62"} Mar 18 16:53:56.620690 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:56.620628 2573 generic.go:358] "Generic (PLEG): container finished" podID="8376f4ed-27af-4e64-b2c7-ef984398bdc1" containerID="51a95a750b0ed13295d3c80bf314a2465590039a6ce116f28c7bfab6d7eb3dab" exitCode=0 Mar 18 16:53:56.620754 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:56.620705 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bdh99n" event={"ID":"8376f4ed-27af-4e64-b2c7-ef984398bdc1","Type":"ContainerDied","Data":"51a95a750b0ed13295d3c80bf314a2465590039a6ce116f28c7bfab6d7eb3dab"} Mar 18 16:53:56.620754 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:56.620743 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bdh99n" event={"ID":"8376f4ed-27af-4e64-b2c7-ef984398bdc1","Type":"ContainerStarted","Data":"074c1761ce2bf2544569eb2f45255535681c7cac6d32afa2757b2237de9d7b5a"} Mar 18 16:53:56.622289 ip-10-0-135-99 
kubenswrapper[2573]: I0318 16:53:56.622246 2573 generic.go:358] "Generic (PLEG): container finished" podID="3e2db8bd-766f-4781-a4bb-1c09e35b116c" containerID="911a097433ebc026200d31073f3dfaddc9b73890e162227d28f73e9d7167b756" exitCode=0 Mar 18 16:53:56.622377 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:56.622321 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec887rwhg" event={"ID":"3e2db8bd-766f-4781-a4bb-1c09e35b116c","Type":"ContainerDied","Data":"911a097433ebc026200d31073f3dfaddc9b73890e162227d28f73e9d7167b756"} Mar 18 16:53:56.622377 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:56.622374 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec887rwhg" event={"ID":"3e2db8bd-766f-4781-a4bb-1c09e35b116c","Type":"ContainerStarted","Data":"675fb0f92eed84ae29c6eeffb56db8eb26c6c51b97790d0b2cd2c60037c05cc5"} Mar 18 16:53:56.623770 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:56.623746 2573 generic.go:358] "Generic (PLEG): container finished" podID="e11dfcd9-ee4a-4e94-9beb-94895e113198" containerID="843c8bd2ee69dd2f3c7caf683879a37f0827a65b0bf49f4e5cd7cc5bee37ec64" exitCode=0 Mar 18 16:53:56.623862 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:56.623772 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c308flrd" event={"ID":"e11dfcd9-ee4a-4e94-9beb-94895e113198","Type":"ContainerDied","Data":"843c8bd2ee69dd2f3c7caf683879a37f0827a65b0bf49f4e5cd7cc5bee37ec64"} Mar 18 16:53:56.623862 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:56.623796 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c308flrd" 
event={"ID":"e11dfcd9-ee4a-4e94-9beb-94895e113198","Type":"ContainerStarted","Data":"6ea3ed1743f12b187bacd506bbbb350029137f75883b257634580d33d8bde80e"} Mar 18 16:53:58.637197 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:58.637160 2573 generic.go:358] "Generic (PLEG): container finished" podID="e11dfcd9-ee4a-4e94-9beb-94895e113198" containerID="29627000a253f4dfe4292cc290998f6b2b8490edf1058500938ebc534dd0bf90" exitCode=0 Mar 18 16:53:58.637629 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:58.637242 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c308flrd" event={"ID":"e11dfcd9-ee4a-4e94-9beb-94895e113198","Type":"ContainerDied","Data":"29627000a253f4dfe4292cc290998f6b2b8490edf1058500938ebc534dd0bf90"} Mar 18 16:53:58.639022 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:58.638970 2573 generic.go:358] "Generic (PLEG): container finished" podID="ab965fdc-f9c6-4e4f-982f-5b26385e1ed6" containerID="29ad252bf8218b84317333a218c1d0f44e994af41d2431b6e03253408f80ea9d" exitCode=0 Mar 18 16:53:58.639101 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:58.639061 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b533c66b650aa9f48b663f855ad059523d1aee816b6832a0a59c6c34chrnvp" event={"ID":"ab965fdc-f9c6-4e4f-982f-5b26385e1ed6","Type":"ContainerDied","Data":"29ad252bf8218b84317333a218c1d0f44e994af41d2431b6e03253408f80ea9d"} Mar 18 16:53:58.640612 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:58.640587 2573 generic.go:358] "Generic (PLEG): container finished" podID="3e2db8bd-766f-4781-a4bb-1c09e35b116c" containerID="934c0209cfa9e9b329248934a649071513aeef7590731497a3c301af968d70b8" exitCode=0 Mar 18 16:53:58.640730 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:58.640659 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec887rwhg" 
event={"ID":"3e2db8bd-766f-4781-a4bb-1c09e35b116c","Type":"ContainerDied","Data":"934c0209cfa9e9b329248934a649071513aeef7590731497a3c301af968d70b8"} Mar 18 16:53:59.646351 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:59.646315 2573 generic.go:358] "Generic (PLEG): container finished" podID="ab965fdc-f9c6-4e4f-982f-5b26385e1ed6" containerID="23b0befc7d434c92ff718acb311c59a54a3160ceb5594c8187fb8f4750ab72ae" exitCode=0 Mar 18 16:53:59.646792 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:59.646419 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b533c66b650aa9f48b663f855ad059523d1aee816b6832a0a59c6c34chrnvp" event={"ID":"ab965fdc-f9c6-4e4f-982f-5b26385e1ed6","Type":"ContainerDied","Data":"23b0befc7d434c92ff718acb311c59a54a3160ceb5594c8187fb8f4750ab72ae"} Mar 18 16:53:59.648184 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:59.648161 2573 generic.go:358] "Generic (PLEG): container finished" podID="3e2db8bd-766f-4781-a4bb-1c09e35b116c" containerID="f4b0074cec6ddff83b3aa798761b82b6e04299f619677d92ff51c44856bce955" exitCode=0 Mar 18 16:53:59.648332 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:59.648244 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec887rwhg" event={"ID":"3e2db8bd-766f-4781-a4bb-1c09e35b116c","Type":"ContainerDied","Data":"f4b0074cec6ddff83b3aa798761b82b6e04299f619677d92ff51c44856bce955"} Mar 18 16:53:59.650004 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:59.649978 2573 generic.go:358] "Generic (PLEG): container finished" podID="e11dfcd9-ee4a-4e94-9beb-94895e113198" containerID="09d1f2902723d7efd23bead1b9da4229ec8fe5c035b3552582278fba2a65fa13" exitCode=0 Mar 18 16:53:59.650099 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:53:59.650005 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c308flrd" 
event={"ID":"e11dfcd9-ee4a-4e94-9beb-94895e113198","Type":"ContainerDied","Data":"09d1f2902723d7efd23bead1b9da4229ec8fe5c035b3552582278fba2a65fa13"} Mar 18 16:54:00.811955 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:00.811916 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b533c66b650aa9f48b663f855ad059523d1aee816b6832a0a59c6c34chrnvp" Mar 18 16:54:00.845727 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:00.845702 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec887rwhg" Mar 18 16:54:00.848894 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:00.848875 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c308flrd" Mar 18 16:54:00.990892 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:00.990855 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e11dfcd9-ee4a-4e94-9beb-94895e113198-util\") pod \"e11dfcd9-ee4a-4e94-9beb-94895e113198\" (UID: \"e11dfcd9-ee4a-4e94-9beb-94895e113198\") " Mar 18 16:54:00.990892 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:00.990894 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3e2db8bd-766f-4781-a4bb-1c09e35b116c-util\") pod \"3e2db8bd-766f-4781-a4bb-1c09e35b116c\" (UID: \"3e2db8bd-766f-4781-a4bb-1c09e35b116c\") " Mar 18 16:54:00.991149 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:00.990928 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ab965fdc-f9c6-4e4f-982f-5b26385e1ed6-bundle\") pod \"ab965fdc-f9c6-4e4f-982f-5b26385e1ed6\" (UID: \"ab965fdc-f9c6-4e4f-982f-5b26385e1ed6\") " Mar 18 16:54:00.991149 
ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:00.990992 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xprbb\" (UniqueName: \"kubernetes.io/projected/e11dfcd9-ee4a-4e94-9beb-94895e113198-kube-api-access-xprbb\") pod \"e11dfcd9-ee4a-4e94-9beb-94895e113198\" (UID: \"e11dfcd9-ee4a-4e94-9beb-94895e113198\") " Mar 18 16:54:00.991149 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:00.991018 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e11dfcd9-ee4a-4e94-9beb-94895e113198-bundle\") pod \"e11dfcd9-ee4a-4e94-9beb-94895e113198\" (UID: \"e11dfcd9-ee4a-4e94-9beb-94895e113198\") " Mar 18 16:54:00.991149 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:00.991046 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qd24z\" (UniqueName: \"kubernetes.io/projected/ab965fdc-f9c6-4e4f-982f-5b26385e1ed6-kube-api-access-qd24z\") pod \"ab965fdc-f9c6-4e4f-982f-5b26385e1ed6\" (UID: \"ab965fdc-f9c6-4e4f-982f-5b26385e1ed6\") " Mar 18 16:54:00.991149 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:00.991075 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ab965fdc-f9c6-4e4f-982f-5b26385e1ed6-util\") pod \"ab965fdc-f9c6-4e4f-982f-5b26385e1ed6\" (UID: \"ab965fdc-f9c6-4e4f-982f-5b26385e1ed6\") " Mar 18 16:54:00.991149 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:00.991107 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6tzv\" (UniqueName: \"kubernetes.io/projected/3e2db8bd-766f-4781-a4bb-1c09e35b116c-kube-api-access-l6tzv\") pod \"3e2db8bd-766f-4781-a4bb-1c09e35b116c\" (UID: \"3e2db8bd-766f-4781-a4bb-1c09e35b116c\") " Mar 18 16:54:00.991149 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:00.991141 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for 
volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3e2db8bd-766f-4781-a4bb-1c09e35b116c-bundle\") pod \"3e2db8bd-766f-4781-a4bb-1c09e35b116c\" (UID: \"3e2db8bd-766f-4781-a4bb-1c09e35b116c\") " Mar 18 16:54:00.991651 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:00.991622 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e11dfcd9-ee4a-4e94-9beb-94895e113198-bundle" (OuterVolumeSpecName: "bundle") pod "e11dfcd9-ee4a-4e94-9beb-94895e113198" (UID: "e11dfcd9-ee4a-4e94-9beb-94895e113198"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:54:00.991651 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:00.991632 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab965fdc-f9c6-4e4f-982f-5b26385e1ed6-bundle" (OuterVolumeSpecName: "bundle") pod "ab965fdc-f9c6-4e4f-982f-5b26385e1ed6" (UID: "ab965fdc-f9c6-4e4f-982f-5b26385e1ed6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:54:00.991980 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:00.991934 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e2db8bd-766f-4781-a4bb-1c09e35b116c-bundle" (OuterVolumeSpecName: "bundle") pod "3e2db8bd-766f-4781-a4bb-1c09e35b116c" (UID: "3e2db8bd-766f-4781-a4bb-1c09e35b116c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:54:00.993409 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:00.993378 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e11dfcd9-ee4a-4e94-9beb-94895e113198-kube-api-access-xprbb" (OuterVolumeSpecName: "kube-api-access-xprbb") pod "e11dfcd9-ee4a-4e94-9beb-94895e113198" (UID: "e11dfcd9-ee4a-4e94-9beb-94895e113198"). InnerVolumeSpecName "kube-api-access-xprbb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:54:00.993602 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:00.993581 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e2db8bd-766f-4781-a4bb-1c09e35b116c-kube-api-access-l6tzv" (OuterVolumeSpecName: "kube-api-access-l6tzv") pod "3e2db8bd-766f-4781-a4bb-1c09e35b116c" (UID: "3e2db8bd-766f-4781-a4bb-1c09e35b116c"). InnerVolumeSpecName "kube-api-access-l6tzv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:54:00.993763 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:00.993731 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab965fdc-f9c6-4e4f-982f-5b26385e1ed6-kube-api-access-qd24z" (OuterVolumeSpecName: "kube-api-access-qd24z") pod "ab965fdc-f9c6-4e4f-982f-5b26385e1ed6" (UID: "ab965fdc-f9c6-4e4f-982f-5b26385e1ed6"). InnerVolumeSpecName "kube-api-access-qd24z". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:54:00.996144 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:00.996120 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e2db8bd-766f-4781-a4bb-1c09e35b116c-util" (OuterVolumeSpecName: "util") pod "3e2db8bd-766f-4781-a4bb-1c09e35b116c" (UID: "3e2db8bd-766f-4781-a4bb-1c09e35b116c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:54:00.996758 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:00.996737 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e11dfcd9-ee4a-4e94-9beb-94895e113198-util" (OuterVolumeSpecName: "util") pod "e11dfcd9-ee4a-4e94-9beb-94895e113198" (UID: "e11dfcd9-ee4a-4e94-9beb-94895e113198"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:54:01.066909 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:01.066867 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab965fdc-f9c6-4e4f-982f-5b26385e1ed6-util" (OuterVolumeSpecName: "util") pod "ab965fdc-f9c6-4e4f-982f-5b26385e1ed6" (UID: "ab965fdc-f9c6-4e4f-982f-5b26385e1ed6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:54:01.092798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:01.092762 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e11dfcd9-ee4a-4e94-9beb-94895e113198-util\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\"" Mar 18 16:54:01.092798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:01.092794 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3e2db8bd-766f-4781-a4bb-1c09e35b116c-util\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\"" Mar 18 16:54:01.092798 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:01.092805 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ab965fdc-f9c6-4e4f-982f-5b26385e1ed6-bundle\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\"" Mar 18 16:54:01.093044 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:01.092815 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xprbb\" (UniqueName: \"kubernetes.io/projected/e11dfcd9-ee4a-4e94-9beb-94895e113198-kube-api-access-xprbb\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\"" Mar 18 16:54:01.093044 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:01.092826 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e11dfcd9-ee4a-4e94-9beb-94895e113198-bundle\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\"" Mar 18 
16:54:01.093044 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:01.092834 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qd24z\" (UniqueName: \"kubernetes.io/projected/ab965fdc-f9c6-4e4f-982f-5b26385e1ed6-kube-api-access-qd24z\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\"" Mar 18 16:54:01.093044 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:01.092843 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ab965fdc-f9c6-4e4f-982f-5b26385e1ed6-util\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\"" Mar 18 16:54:01.093044 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:01.092853 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l6tzv\" (UniqueName: \"kubernetes.io/projected/3e2db8bd-766f-4781-a4bb-1c09e35b116c-kube-api-access-l6tzv\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\"" Mar 18 16:54:01.093044 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:01.092863 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3e2db8bd-766f-4781-a4bb-1c09e35b116c-bundle\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\"" Mar 18 16:54:01.659273 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:01.659240 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c308flrd" event={"ID":"e11dfcd9-ee4a-4e94-9beb-94895e113198","Type":"ContainerDied","Data":"6ea3ed1743f12b187bacd506bbbb350029137f75883b257634580d33d8bde80e"} Mar 18 16:54:01.659443 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:01.659278 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ea3ed1743f12b187bacd506bbbb350029137f75883b257634580d33d8bde80e" Mar 18 16:54:01.659443 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:01.659249 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c308flrd" Mar 18 16:54:01.661122 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:01.661103 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b533c66b650aa9f48b663f855ad059523d1aee816b6832a0a59c6c34chrnvp" Mar 18 16:54:01.661248 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:01.661145 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b533c66b650aa9f48b663f855ad059523d1aee816b6832a0a59c6c34chrnvp" event={"ID":"ab965fdc-f9c6-4e4f-982f-5b26385e1ed6","Type":"ContainerDied","Data":"5f1c73fad23787208a811686b2c100833b2abbd176c293fcd85c8f5116fbef62"} Mar 18 16:54:01.661248 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:01.661181 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f1c73fad23787208a811686b2c100833b2abbd176c293fcd85c8f5116fbef62" Mar 18 16:54:01.662885 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:01.662862 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec887rwhg" event={"ID":"3e2db8bd-766f-4781-a4bb-1c09e35b116c","Type":"ContainerDied","Data":"675fb0f92eed84ae29c6eeffb56db8eb26c6c51b97790d0b2cd2c60037c05cc5"} Mar 18 16:54:01.662885 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:01.662881 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec887rwhg" Mar 18 16:54:01.663023 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:01.662891 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="675fb0f92eed84ae29c6eeffb56db8eb26c6c51b97790d0b2cd2c60037c05cc5" Mar 18 16:54:05.678636 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:05.678594 2573 generic.go:358] "Generic (PLEG): container finished" podID="8376f4ed-27af-4e64-b2c7-ef984398bdc1" containerID="e361b05593c0e7a038ec8fa7be4b878043d80d02c1da936a9986ca7a770d8cc9" exitCode=0 Mar 18 16:54:05.679146 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:05.678680 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bdh99n" event={"ID":"8376f4ed-27af-4e64-b2c7-ef984398bdc1","Type":"ContainerDied","Data":"e361b05593c0e7a038ec8fa7be4b878043d80d02c1da936a9986ca7a770d8cc9"} Mar 18 16:54:06.684577 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:06.684543 2573 generic.go:358] "Generic (PLEG): container finished" podID="8376f4ed-27af-4e64-b2c7-ef984398bdc1" containerID="9320cc05014565678b49dc072df858cc118441b2b5e323cb7a1c36d76761215a" exitCode=0 Mar 18 16:54:06.684961 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:06.684623 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bdh99n" event={"ID":"8376f4ed-27af-4e64-b2c7-ef984398bdc1","Type":"ContainerDied","Data":"9320cc05014565678b49dc072df858cc118441b2b5e323cb7a1c36d76761215a"} Mar 18 16:54:07.806473 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:07.806451 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bdh99n" Mar 18 16:54:07.851274 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:07.851242 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8376f4ed-27af-4e64-b2c7-ef984398bdc1-bundle\") pod \"8376f4ed-27af-4e64-b2c7-ef984398bdc1\" (UID: \"8376f4ed-27af-4e64-b2c7-ef984398bdc1\") " Mar 18 16:54:07.851437 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:07.851303 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8dcn\" (UniqueName: \"kubernetes.io/projected/8376f4ed-27af-4e64-b2c7-ef984398bdc1-kube-api-access-f8dcn\") pod \"8376f4ed-27af-4e64-b2c7-ef984398bdc1\" (UID: \"8376f4ed-27af-4e64-b2c7-ef984398bdc1\") " Mar 18 16:54:07.851698 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:07.851673 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8376f4ed-27af-4e64-b2c7-ef984398bdc1-bundle" (OuterVolumeSpecName: "bundle") pod "8376f4ed-27af-4e64-b2c7-ef984398bdc1" (UID: "8376f4ed-27af-4e64-b2c7-ef984398bdc1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:54:07.853371 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:07.853346 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8376f4ed-27af-4e64-b2c7-ef984398bdc1-kube-api-access-f8dcn" (OuterVolumeSpecName: "kube-api-access-f8dcn") pod "8376f4ed-27af-4e64-b2c7-ef984398bdc1" (UID: "8376f4ed-27af-4e64-b2c7-ef984398bdc1"). InnerVolumeSpecName "kube-api-access-f8dcn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:54:07.952236 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:07.952139 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8376f4ed-27af-4e64-b2c7-ef984398bdc1-util\") pod \"8376f4ed-27af-4e64-b2c7-ef984398bdc1\" (UID: \"8376f4ed-27af-4e64-b2c7-ef984398bdc1\") " Mar 18 16:54:07.952382 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:07.952287 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f8dcn\" (UniqueName: \"kubernetes.io/projected/8376f4ed-27af-4e64-b2c7-ef984398bdc1-kube-api-access-f8dcn\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\"" Mar 18 16:54:07.952382 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:07.952298 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8376f4ed-27af-4e64-b2c7-ef984398bdc1-bundle\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\"" Mar 18 16:54:07.956533 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:07.956490 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8376f4ed-27af-4e64-b2c7-ef984398bdc1-util" (OuterVolumeSpecName: "util") pod "8376f4ed-27af-4e64-b2c7-ef984398bdc1" (UID: "8376f4ed-27af-4e64-b2c7-ef984398bdc1"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:54:08.052783 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:08.052749 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8376f4ed-27af-4e64-b2c7-ef984398bdc1-util\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\"" Mar 18 16:54:08.692819 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:08.692783 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bdh99n" event={"ID":"8376f4ed-27af-4e64-b2c7-ef984398bdc1","Type":"ContainerDied","Data":"074c1761ce2bf2544569eb2f45255535681c7cac6d32afa2757b2237de9d7b5a"} Mar 18 16:54:08.692819 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:08.692824 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="074c1761ce2bf2544569eb2f45255535681c7cac6d32afa2757b2237de9d7b5a" Mar 18 16:54:08.693078 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:08.692832 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bdh99n" Mar 18 16:54:19.833355 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:19.833302 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-xfdrq"] Mar 18 16:54:19.833788 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:19.833673 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8376f4ed-27af-4e64-b2c7-ef984398bdc1" containerName="util" Mar 18 16:54:19.833788 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:19.833686 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="8376f4ed-27af-4e64-b2c7-ef984398bdc1" containerName="util" Mar 18 16:54:19.833788 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:19.833694 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ab965fdc-f9c6-4e4f-982f-5b26385e1ed6" containerName="util" Mar 18 16:54:19.833788 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:19.833699 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab965fdc-f9c6-4e4f-982f-5b26385e1ed6" containerName="util" Mar 18 16:54:19.833788 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:19.833706 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3e2db8bd-766f-4781-a4bb-1c09e35b116c" containerName="util" Mar 18 16:54:19.833788 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:19.833713 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e2db8bd-766f-4781-a4bb-1c09e35b116c" containerName="util" Mar 18 16:54:19.833788 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:19.833730 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e11dfcd9-ee4a-4e94-9beb-94895e113198" containerName="util" Mar 18 16:54:19.833788 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:19.833735 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e11dfcd9-ee4a-4e94-9beb-94895e113198" 
containerName="util" Mar 18 16:54:19.833788 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:19.833741 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8376f4ed-27af-4e64-b2c7-ef984398bdc1" containerName="extract" Mar 18 16:54:19.833788 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:19.833746 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="8376f4ed-27af-4e64-b2c7-ef984398bdc1" containerName="extract" Mar 18 16:54:19.833788 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:19.833754 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e11dfcd9-ee4a-4e94-9beb-94895e113198" containerName="pull" Mar 18 16:54:19.833788 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:19.833759 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e11dfcd9-ee4a-4e94-9beb-94895e113198" containerName="pull" Mar 18 16:54:19.833788 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:19.833765 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3e2db8bd-766f-4781-a4bb-1c09e35b116c" containerName="pull" Mar 18 16:54:19.833788 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:19.833770 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e2db8bd-766f-4781-a4bb-1c09e35b116c" containerName="pull" Mar 18 16:54:19.833788 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:19.833775 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ab965fdc-f9c6-4e4f-982f-5b26385e1ed6" containerName="pull" Mar 18 16:54:19.833788 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:19.833780 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab965fdc-f9c6-4e4f-982f-5b26385e1ed6" containerName="pull" Mar 18 16:54:19.833788 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:19.833788 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3e2db8bd-766f-4781-a4bb-1c09e35b116c" containerName="extract" Mar 18 16:54:19.833788 ip-10-0-135-99 
kubenswrapper[2573]: I0318 16:54:19.833796 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e2db8bd-766f-4781-a4bb-1c09e35b116c" containerName="extract" Mar 18 16:54:19.834412 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:19.833805 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e11dfcd9-ee4a-4e94-9beb-94895e113198" containerName="extract" Mar 18 16:54:19.834412 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:19.833811 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e11dfcd9-ee4a-4e94-9beb-94895e113198" containerName="extract" Mar 18 16:54:19.834412 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:19.833817 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8376f4ed-27af-4e64-b2c7-ef984398bdc1" containerName="pull" Mar 18 16:54:19.834412 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:19.833824 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="8376f4ed-27af-4e64-b2c7-ef984398bdc1" containerName="pull" Mar 18 16:54:19.834412 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:19.833831 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ab965fdc-f9c6-4e4f-982f-5b26385e1ed6" containerName="extract" Mar 18 16:54:19.834412 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:19.833836 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab965fdc-f9c6-4e4f-982f-5b26385e1ed6" containerName="extract" Mar 18 16:54:19.834412 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:19.833888 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="ab965fdc-f9c6-4e4f-982f-5b26385e1ed6" containerName="extract" Mar 18 16:54:19.834412 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:19.833896 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="8376f4ed-27af-4e64-b2c7-ef984398bdc1" containerName="extract" Mar 18 16:54:19.834412 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:19.833903 2573 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="3e2db8bd-766f-4781-a4bb-1c09e35b116c" containerName="extract" Mar 18 16:54:19.834412 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:19.833911 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="e11dfcd9-ee4a-4e94-9beb-94895e113198" containerName="extract" Mar 18 16:54:19.843260 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:19.843238 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-xfdrq" Mar 18 16:54:19.847052 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:19.847009 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Mar 18 16:54:19.847052 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:19.847013 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-l4x9p\"" Mar 18 16:54:19.847283 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:19.847069 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Mar 18 16:54:19.848693 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:19.848668 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-xfdrq"] Mar 18 16:54:19.934013 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:19.933972 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-884gs\" (UniqueName: \"kubernetes.io/projected/25c99cac-7f4e-4de0-bbc0-ecc93917f93f-kube-api-access-884gs\") pod \"authorino-operator-7587b89b76-xfdrq\" (UID: \"25c99cac-7f4e-4de0-bbc0-ecc93917f93f\") " pod="kuadrant-system/authorino-operator-7587b89b76-xfdrq" Mar 18 16:54:20.034427 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:20.034381 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-884gs\" (UniqueName: \"kubernetes.io/projected/25c99cac-7f4e-4de0-bbc0-ecc93917f93f-kube-api-access-884gs\") pod \"authorino-operator-7587b89b76-xfdrq\" (UID: \"25c99cac-7f4e-4de0-bbc0-ecc93917f93f\") " pod="kuadrant-system/authorino-operator-7587b89b76-xfdrq" Mar 18 16:54:20.042491 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:20.042464 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-884gs\" (UniqueName: \"kubernetes.io/projected/25c99cac-7f4e-4de0-bbc0-ecc93917f93f-kube-api-access-884gs\") pod \"authorino-operator-7587b89b76-xfdrq\" (UID: \"25c99cac-7f4e-4de0-bbc0-ecc93917f93f\") " pod="kuadrant-system/authorino-operator-7587b89b76-xfdrq" Mar 18 16:54:20.154896 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:20.154793 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-xfdrq" Mar 18 16:54:20.304037 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:20.304002 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-xfdrq"] Mar 18 16:54:20.306993 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:54:20.306963 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25c99cac_7f4e_4de0_bbc0_ecc93917f93f.slice/crio-cefe81d951ce72d6a00ab4c67e9af7558fdbf441fcf3b1f110630d1874479410 WatchSource:0}: Error finding container cefe81d951ce72d6a00ab4c67e9af7558fdbf441fcf3b1f110630d1874479410: Status 404 returned error can't find the container with id cefe81d951ce72d6a00ab4c67e9af7558fdbf441fcf3b1f110630d1874479410 Mar 18 16:54:20.743089 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:20.743051 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-xfdrq" 
event={"ID":"25c99cac-7f4e-4de0-bbc0-ecc93917f93f","Type":"ContainerStarted","Data":"cefe81d951ce72d6a00ab4c67e9af7558fdbf441fcf3b1f110630d1874479410"} Mar 18 16:54:22.117116 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:22.117076 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-97lx5"] Mar 18 16:54:22.128024 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:22.127988 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-97lx5" Mar 18 16:54:22.130986 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:22.130902 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-nc76l\"" Mar 18 16:54:22.132135 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:22.132097 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-97lx5"] Mar 18 16:54:22.150076 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:22.150044 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-994bb\" (UniqueName: \"kubernetes.io/projected/50bf4aec-07e0-49a0-8b8b-c8a6fc1aadc9-kube-api-access-994bb\") pod \"limitador-operator-controller-manager-c7fb4c8d5-97lx5\" (UID: \"50bf4aec-07e0-49a0-8b8b-c8a6fc1aadc9\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-97lx5" Mar 18 16:54:22.251342 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:22.251302 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-994bb\" (UniqueName: \"kubernetes.io/projected/50bf4aec-07e0-49a0-8b8b-c8a6fc1aadc9-kube-api-access-994bb\") pod \"limitador-operator-controller-manager-c7fb4c8d5-97lx5\" (UID: \"50bf4aec-07e0-49a0-8b8b-c8a6fc1aadc9\") " 
pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-97lx5" Mar 18 16:54:22.264293 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:22.264263 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-994bb\" (UniqueName: \"kubernetes.io/projected/50bf4aec-07e0-49a0-8b8b-c8a6fc1aadc9-kube-api-access-994bb\") pod \"limitador-operator-controller-manager-c7fb4c8d5-97lx5\" (UID: \"50bf4aec-07e0-49a0-8b8b-c8a6fc1aadc9\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-97lx5" Mar 18 16:54:22.443061 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:22.443021 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-97lx5" Mar 18 16:54:22.726686 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:22.726663 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-97lx5"] Mar 18 16:54:22.728629 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:54:22.728600 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50bf4aec_07e0_49a0_8b8b_c8a6fc1aadc9.slice/crio-b4677193ab57461085fc9524c7d6a08687b024016b9f8adf410063a161212a63 WatchSource:0}: Error finding container b4677193ab57461085fc9524c7d6a08687b024016b9f8adf410063a161212a63: Status 404 returned error can't find the container with id b4677193ab57461085fc9524c7d6a08687b024016b9f8adf410063a161212a63 Mar 18 16:54:22.753264 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:22.753229 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-xfdrq" event={"ID":"25c99cac-7f4e-4de0-bbc0-ecc93917f93f","Type":"ContainerStarted","Data":"f34b8ddbb732a6d8dd4790a36001a61a57b07c0f4ce5a8e92cf7fd9977089474"} Mar 18 16:54:22.753421 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:22.753307 2573 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-7587b89b76-xfdrq"
Mar 18 16:54:22.754570 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:22.754546 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-97lx5" event={"ID":"50bf4aec-07e0-49a0-8b8b-c8a6fc1aadc9","Type":"ContainerStarted","Data":"b4677193ab57461085fc9524c7d6a08687b024016b9f8adf410063a161212a63"}
Mar 18 16:54:22.785423 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:22.785362 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-7587b89b76-xfdrq" podStartSLOduration=1.477468246 podStartE2EDuration="3.78534665s" podCreationTimestamp="2026-03-18 16:54:19 +0000 UTC" firstStartedPulling="2026-03-18 16:54:20.309139256 +0000 UTC m=+588.024562233" lastFinishedPulling="2026-03-18 16:54:22.617017648 +0000 UTC m=+590.332440637" observedRunningTime="2026-03-18 16:54:22.783504417 +0000 UTC m=+590.498927415" watchObservedRunningTime="2026-03-18 16:54:22.78534665 +0000 UTC m=+590.500769647"
Mar 18 16:54:24.770255 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:24.770208 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-97lx5" event={"ID":"50bf4aec-07e0-49a0-8b8b-c8a6fc1aadc9","Type":"ContainerStarted","Data":"d5e21e7540fd305ea754e54dbd3f8c213a999727ee2667d0a2dcca063d974af9"}
Mar 18 16:54:24.770641 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:24.770302 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-97lx5"
Mar 18 16:54:24.793929 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:24.793874 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-97lx5" podStartSLOduration=1.322964527 podStartE2EDuration="2.793858151s" podCreationTimestamp="2026-03-18 16:54:22 +0000 UTC" firstStartedPulling="2026-03-18 16:54:22.730785736 +0000 UTC m=+590.446208712" lastFinishedPulling="2026-03-18 16:54:24.201679357 +0000 UTC m=+591.917102336" observedRunningTime="2026-03-18 16:54:24.791784085 +0000 UTC m=+592.507207084" watchObservedRunningTime="2026-03-18 16:54:24.793858151 +0000 UTC m=+592.509281149"
Mar 18 16:54:27.882677 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:27.882642 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6c4df77685-bmsdt"]
Mar 18 16:54:27.886285 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:27.886267 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6c4df77685-bmsdt"
Mar 18 16:54:27.889072 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:27.889044 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-lpndj\""
Mar 18 16:54:27.889799 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:27.889775 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3e363269-fc50-4b20-a57f-e5b79da02a1d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6c4df77685-bmsdt\" (UID: \"3e363269-fc50-4b20-a57f-e5b79da02a1d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6c4df77685-bmsdt"
Mar 18 16:54:27.889930 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:27.889848 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wjct\" (UniqueName: \"kubernetes.io/projected/3e363269-fc50-4b20-a57f-e5b79da02a1d-kube-api-access-8wjct\") pod \"kuadrant-operator-controller-manager-6c4df77685-bmsdt\" (UID: \"3e363269-fc50-4b20-a57f-e5b79da02a1d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6c4df77685-bmsdt"
Mar 18 16:54:27.896507 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:27.896474 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6c4df77685-bmsdt"]
Mar 18 16:54:27.991162 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:27.991113 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3e363269-fc50-4b20-a57f-e5b79da02a1d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6c4df77685-bmsdt\" (UID: \"3e363269-fc50-4b20-a57f-e5b79da02a1d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6c4df77685-bmsdt"
Mar 18 16:54:27.991162 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:27.991166 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8wjct\" (UniqueName: \"kubernetes.io/projected/3e363269-fc50-4b20-a57f-e5b79da02a1d-kube-api-access-8wjct\") pod \"kuadrant-operator-controller-manager-6c4df77685-bmsdt\" (UID: \"3e363269-fc50-4b20-a57f-e5b79da02a1d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6c4df77685-bmsdt"
Mar 18 16:54:27.991580 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:27.991558 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3e363269-fc50-4b20-a57f-e5b79da02a1d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6c4df77685-bmsdt\" (UID: \"3e363269-fc50-4b20-a57f-e5b79da02a1d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6c4df77685-bmsdt"
Mar 18 16:54:27.999415 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:27.999391 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wjct\" (UniqueName: \"kubernetes.io/projected/3e363269-fc50-4b20-a57f-e5b79da02a1d-kube-api-access-8wjct\") pod \"kuadrant-operator-controller-manager-6c4df77685-bmsdt\" (UID: \"3e363269-fc50-4b20-a57f-e5b79da02a1d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6c4df77685-bmsdt"
Mar 18 16:54:28.197430 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:28.197392 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6c4df77685-bmsdt"
Mar 18 16:54:28.336530 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:54:28.336361 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e363269_fc50_4b20_a57f_e5b79da02a1d.slice/crio-2b72161664b30580dbcdae2fbcbfb71d23d5cbb6710a6a025dded0fd48267ef1 WatchSource:0}: Error finding container 2b72161664b30580dbcdae2fbcbfb71d23d5cbb6710a6a025dded0fd48267ef1: Status 404 returned error can't find the container with id 2b72161664b30580dbcdae2fbcbfb71d23d5cbb6710a6a025dded0fd48267ef1
Mar 18 16:54:28.337760 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:28.337732 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6c4df77685-bmsdt"]
Mar 18 16:54:28.786738 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:28.786698 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6c4df77685-bmsdt" event={"ID":"3e363269-fc50-4b20-a57f-e5b79da02a1d","Type":"ContainerStarted","Data":"2b72161664b30580dbcdae2fbcbfb71d23d5cbb6710a6a025dded0fd48267ef1"}
Mar 18 16:54:32.798590 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:32.795934 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qzdnc_c6018dac-2f72-43d9-b554-dffe8cf976c4/ovn-acl-logging/0.log"
Mar 18 16:54:32.798590 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:32.796019 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qzdnc_c6018dac-2f72-43d9-b554-dffe8cf976c4/ovn-acl-logging/0.log"
Mar 18 16:54:32.807912 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:32.807873 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6c4df77685-bmsdt" event={"ID":"3e363269-fc50-4b20-a57f-e5b79da02a1d","Type":"ContainerStarted","Data":"c493784c252d67daca469653dff2c11da394cf445d309fd852e95128b201610b"}
Mar 18 16:54:32.808118 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:32.807976 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6c4df77685-bmsdt"
Mar 18 16:54:32.829822 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:32.829757 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6c4df77685-bmsdt" podStartSLOduration=1.952804776 podStartE2EDuration="5.829740789s" podCreationTimestamp="2026-03-18 16:54:27 +0000 UTC" firstStartedPulling="2026-03-18 16:54:28.338980779 +0000 UTC m=+596.054403756" lastFinishedPulling="2026-03-18 16:54:32.215916793 +0000 UTC m=+599.931339769" observedRunningTime="2026-03-18 16:54:32.828453069 +0000 UTC m=+600.543876068" watchObservedRunningTime="2026-03-18 16:54:32.829740789 +0000 UTC m=+600.545163880"
Mar 18 16:54:33.767208 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:33.767169 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-7587b89b76-xfdrq"
Mar 18 16:54:35.606577 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:35.606539 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-79cbc94b89-tkvbt"]
Mar 18 16:54:35.610625 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:35.610601 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-tkvbt"
Mar 18 16:54:35.613072 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:35.613050 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-mgks8\""
Mar 18 16:54:35.617385 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:35.617359 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-tkvbt"]
Mar 18 16:54:35.663479 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:35.663440 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4fjl\" (UniqueName: \"kubernetes.io/projected/f79abbca-eae2-447a-9a57-35f67b807f2a-kube-api-access-s4fjl\") pod \"authorino-79cbc94b89-tkvbt\" (UID: \"f79abbca-eae2-447a-9a57-35f67b807f2a\") " pod="kuadrant-system/authorino-79cbc94b89-tkvbt"
Mar 18 16:54:35.764774 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:35.764742 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s4fjl\" (UniqueName: \"kubernetes.io/projected/f79abbca-eae2-447a-9a57-35f67b807f2a-kube-api-access-s4fjl\") pod \"authorino-79cbc94b89-tkvbt\" (UID: \"f79abbca-eae2-447a-9a57-35f67b807f2a\") " pod="kuadrant-system/authorino-79cbc94b89-tkvbt"
Mar 18 16:54:35.774450 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:35.774418 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4fjl\" (UniqueName: \"kubernetes.io/projected/f79abbca-eae2-447a-9a57-35f67b807f2a-kube-api-access-s4fjl\") pod \"authorino-79cbc94b89-tkvbt\" (UID: \"f79abbca-eae2-447a-9a57-35f67b807f2a\") " pod="kuadrant-system/authorino-79cbc94b89-tkvbt"
Mar 18 16:54:35.775765 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:35.775742 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-97lx5"
Mar 18 16:54:35.921420 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:35.921328 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-tkvbt"
Mar 18 16:54:36.047715 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:36.047681 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-tkvbt"]
Mar 18 16:54:36.051035 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:54:36.051007 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf79abbca_eae2_447a_9a57_35f67b807f2a.slice/crio-6e8429912b116313af77d9c0375ed57d4ef1e814d8ac901384d91bf5662b3957 WatchSource:0}: Error finding container 6e8429912b116313af77d9c0375ed57d4ef1e814d8ac901384d91bf5662b3957: Status 404 returned error can't find the container with id 6e8429912b116313af77d9c0375ed57d4ef1e814d8ac901384d91bf5662b3957
Mar 18 16:54:36.826010 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:36.825969 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-tkvbt" event={"ID":"f79abbca-eae2-447a-9a57-35f67b807f2a","Type":"ContainerStarted","Data":"6e8429912b116313af77d9c0375ed57d4ef1e814d8ac901384d91bf5662b3957"}
Mar 18 16:54:39.842396 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:39.842356 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-tkvbt" event={"ID":"f79abbca-eae2-447a-9a57-35f67b807f2a","Type":"ContainerStarted","Data":"eef85cb1a3a9b33b1a164d77f357caf935ba12296d9214074d9bbac7afcbed1a"}
Mar 18 16:54:39.860394 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:39.860337 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-79cbc94b89-tkvbt" podStartSLOduration=1.49580736 podStartE2EDuration="4.860322184s" podCreationTimestamp="2026-03-18 16:54:35 +0000 UTC" firstStartedPulling="2026-03-18 16:54:36.052354347 +0000 UTC m=+603.767777323" lastFinishedPulling="2026-03-18 16:54:39.416869171 +0000 UTC m=+607.132292147" observedRunningTime="2026-03-18 16:54:39.859737913 +0000 UTC m=+607.575160936" watchObservedRunningTime="2026-03-18 16:54:39.860322184 +0000 UTC m=+607.575745184"
Mar 18 16:54:43.815758 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:54:43.815727 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6c4df77685-bmsdt"
Mar 18 16:55:03.919110 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:03.919058 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-tkvbt"]
Mar 18 16:55:03.919664 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:03.919309 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-79cbc94b89-tkvbt" podUID="f79abbca-eae2-447a-9a57-35f67b807f2a" containerName="authorino" containerID="cri-o://eef85cb1a3a9b33b1a164d77f357caf935ba12296d9214074d9bbac7afcbed1a" gracePeriod=30
Mar 18 16:55:04.171561 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:04.171475 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-tkvbt"
Mar 18 16:55:04.312290 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:04.312249 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4fjl\" (UniqueName: \"kubernetes.io/projected/f79abbca-eae2-447a-9a57-35f67b807f2a-kube-api-access-s4fjl\") pod \"f79abbca-eae2-447a-9a57-35f67b807f2a\" (UID: \"f79abbca-eae2-447a-9a57-35f67b807f2a\") "
Mar 18 16:55:04.314656 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:04.314619 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f79abbca-eae2-447a-9a57-35f67b807f2a-kube-api-access-s4fjl" (OuterVolumeSpecName: "kube-api-access-s4fjl") pod "f79abbca-eae2-447a-9a57-35f67b807f2a" (UID: "f79abbca-eae2-447a-9a57-35f67b807f2a"). InnerVolumeSpecName "kube-api-access-s4fjl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 16:55:04.412722 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:04.412684 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s4fjl\" (UniqueName: \"kubernetes.io/projected/f79abbca-eae2-447a-9a57-35f67b807f2a-kube-api-access-s4fjl\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\""
Mar 18 16:55:04.939107 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:04.939073 2573 generic.go:358] "Generic (PLEG): container finished" podID="f79abbca-eae2-447a-9a57-35f67b807f2a" containerID="eef85cb1a3a9b33b1a164d77f357caf935ba12296d9214074d9bbac7afcbed1a" exitCode=0
Mar 18 16:55:04.939569 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:04.939120 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-tkvbt"
Mar 18 16:55:04.939569 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:04.939195 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-tkvbt" event={"ID":"f79abbca-eae2-447a-9a57-35f67b807f2a","Type":"ContainerDied","Data":"eef85cb1a3a9b33b1a164d77f357caf935ba12296d9214074d9bbac7afcbed1a"}
Mar 18 16:55:04.939569 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:04.939225 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-tkvbt" event={"ID":"f79abbca-eae2-447a-9a57-35f67b807f2a","Type":"ContainerDied","Data":"6e8429912b116313af77d9c0375ed57d4ef1e814d8ac901384d91bf5662b3957"}
Mar 18 16:55:04.939569 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:04.939241 2573 scope.go:117] "RemoveContainer" containerID="eef85cb1a3a9b33b1a164d77f357caf935ba12296d9214074d9bbac7afcbed1a"
Mar 18 16:55:04.948809 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:04.948791 2573 scope.go:117] "RemoveContainer" containerID="eef85cb1a3a9b33b1a164d77f357caf935ba12296d9214074d9bbac7afcbed1a"
Mar 18 16:55:04.949131 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:55:04.949105 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eef85cb1a3a9b33b1a164d77f357caf935ba12296d9214074d9bbac7afcbed1a\": container with ID starting with eef85cb1a3a9b33b1a164d77f357caf935ba12296d9214074d9bbac7afcbed1a not found: ID does not exist" containerID="eef85cb1a3a9b33b1a164d77f357caf935ba12296d9214074d9bbac7afcbed1a"
Mar 18 16:55:04.949218 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:04.949138 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eef85cb1a3a9b33b1a164d77f357caf935ba12296d9214074d9bbac7afcbed1a"} err="failed to get container status \"eef85cb1a3a9b33b1a164d77f357caf935ba12296d9214074d9bbac7afcbed1a\": rpc error: code = NotFound desc = could not find container \"eef85cb1a3a9b33b1a164d77f357caf935ba12296d9214074d9bbac7afcbed1a\": container with ID starting with eef85cb1a3a9b33b1a164d77f357caf935ba12296d9214074d9bbac7afcbed1a not found: ID does not exist"
Mar 18 16:55:04.957561 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:04.957533 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-tkvbt"]
Mar 18 16:55:04.960958 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:04.960918 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-tkvbt"]
Mar 18 16:55:06.892921 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:06.892876 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f79abbca-eae2-447a-9a57-35f67b807f2a" path="/var/lib/kubelet/pods/f79abbca-eae2-447a-9a57-35f67b807f2a/volumes"
Mar 18 16:55:10.529530 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:10.529494 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-6m2jn"]
Mar 18 16:55:10.531903 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:10.529837 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f79abbca-eae2-447a-9a57-35f67b807f2a" containerName="authorino"
Mar 18 16:55:10.531903 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:10.529848 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f79abbca-eae2-447a-9a57-35f67b807f2a" containerName="authorino"
Mar 18 16:55:10.531903 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:10.529901 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f79abbca-eae2-447a-9a57-35f67b807f2a" containerName="authorino"
Mar 18 16:55:10.532764 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:10.532747 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6m2jn"
Mar 18 16:55:10.544189 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:10.544158 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-6m2jn"]
Mar 18 16:55:10.666676 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:10.666625 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5d6a5228-b36b-4028-9609-42507be00403-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-6m2jn\" (UID: \"5d6a5228-b36b-4028-9609-42507be00403\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6m2jn"
Mar 18 16:55:10.666676 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:10.666678 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9r4p\" (UniqueName: \"kubernetes.io/projected/5d6a5228-b36b-4028-9609-42507be00403-kube-api-access-x9r4p\") pod \"istiod-openshift-gateway-55ff986f96-6m2jn\" (UID: \"5d6a5228-b36b-4028-9609-42507be00403\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6m2jn"
Mar 18 16:55:10.666972 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:10.666756 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/5d6a5228-b36b-4028-9609-42507be00403-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-6m2jn\" (UID: \"5d6a5228-b36b-4028-9609-42507be00403\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6m2jn"
Mar 18 16:55:10.666972 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:10.666855 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/5d6a5228-b36b-4028-9609-42507be00403-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-6m2jn\" (UID: \"5d6a5228-b36b-4028-9609-42507be00403\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6m2jn"
Mar 18 16:55:10.666972 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:10.666892 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/5d6a5228-b36b-4028-9609-42507be00403-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-6m2jn\" (UID: \"5d6a5228-b36b-4028-9609-42507be00403\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6m2jn"
Mar 18 16:55:10.666972 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:10.666915 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/5d6a5228-b36b-4028-9609-42507be00403-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-6m2jn\" (UID: \"5d6a5228-b36b-4028-9609-42507be00403\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6m2jn"
Mar 18 16:55:10.667151 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:10.666975 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/5d6a5228-b36b-4028-9609-42507be00403-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-6m2jn\" (UID: \"5d6a5228-b36b-4028-9609-42507be00403\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6m2jn"
Mar 18 16:55:10.768143 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:10.768094 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5d6a5228-b36b-4028-9609-42507be00403-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-6m2jn\" (UID: \"5d6a5228-b36b-4028-9609-42507be00403\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6m2jn"
Mar 18 16:55:10.768143 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:10.768149 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9r4p\" (UniqueName: \"kubernetes.io/projected/5d6a5228-b36b-4028-9609-42507be00403-kube-api-access-x9r4p\") pod \"istiod-openshift-gateway-55ff986f96-6m2jn\" (UID: \"5d6a5228-b36b-4028-9609-42507be00403\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6m2jn"
Mar 18 16:55:10.768382 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:10.768194 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/5d6a5228-b36b-4028-9609-42507be00403-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-6m2jn\" (UID: \"5d6a5228-b36b-4028-9609-42507be00403\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6m2jn"
Mar 18 16:55:10.768382 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:10.768254 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/5d6a5228-b36b-4028-9609-42507be00403-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-6m2jn\" (UID: \"5d6a5228-b36b-4028-9609-42507be00403\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6m2jn"
Mar 18 16:55:10.768382 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:10.768287 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/5d6a5228-b36b-4028-9609-42507be00403-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-6m2jn\" (UID: \"5d6a5228-b36b-4028-9609-42507be00403\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6m2jn"
Mar 18 16:55:10.768382 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:10.768315 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/5d6a5228-b36b-4028-9609-42507be00403-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-6m2jn\" (UID: \"5d6a5228-b36b-4028-9609-42507be00403\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6m2jn"
Mar 18 16:55:10.768382 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:10.768350 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/5d6a5228-b36b-4028-9609-42507be00403-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-6m2jn\" (UID: \"5d6a5228-b36b-4028-9609-42507be00403\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6m2jn"
Mar 18 16:55:10.769250 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:10.769201 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/5d6a5228-b36b-4028-9609-42507be00403-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-6m2jn\" (UID: \"5d6a5228-b36b-4028-9609-42507be00403\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6m2jn"
Mar 18 16:55:10.771144 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:10.771112 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5d6a5228-b36b-4028-9609-42507be00403-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-6m2jn\" (UID: \"5d6a5228-b36b-4028-9609-42507be00403\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6m2jn"
Mar 18 16:55:10.771144 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:10.771132 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/5d6a5228-b36b-4028-9609-42507be00403-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-6m2jn\" (UID: \"5d6a5228-b36b-4028-9609-42507be00403\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6m2jn"
Mar 18 16:55:10.771332 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:10.771170 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/5d6a5228-b36b-4028-9609-42507be00403-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-6m2jn\" (UID: \"5d6a5228-b36b-4028-9609-42507be00403\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6m2jn"
Mar 18 16:55:10.771332 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:10.771257 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/5d6a5228-b36b-4028-9609-42507be00403-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-6m2jn\" (UID: \"5d6a5228-b36b-4028-9609-42507be00403\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6m2jn"
Mar 18 16:55:10.780187 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:10.780113 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/5d6a5228-b36b-4028-9609-42507be00403-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-6m2jn\" (UID: \"5d6a5228-b36b-4028-9609-42507be00403\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6m2jn"
Mar 18 16:55:10.781867 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:10.781833 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9r4p\" (UniqueName: \"kubernetes.io/projected/5d6a5228-b36b-4028-9609-42507be00403-kube-api-access-x9r4p\") pod \"istiod-openshift-gateway-55ff986f96-6m2jn\" (UID: \"5d6a5228-b36b-4028-9609-42507be00403\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6m2jn"
Mar 18 16:55:10.844714 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:10.844672 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6m2jn"
Mar 18 16:55:10.990104 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:10.990071 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-6m2jn"]
Mar 18 16:55:10.991417 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:55:10.991383 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d6a5228_b36b_4028_9609_42507be00403.slice/crio-7847d5a495b69ae471aad15af1d59e01eca91153f5c0d1d26a3b443bd277d1df WatchSource:0}: Error finding container 7847d5a495b69ae471aad15af1d59e01eca91153f5c0d1d26a3b443bd277d1df: Status 404 returned error can't find the container with id 7847d5a495b69ae471aad15af1d59e01eca91153f5c0d1d26a3b443bd277d1df
Mar 18 16:55:10.993627 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:10.993586 2573 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Mar 18 16:55:10.993742 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:10.993669 2573 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Mar 18 16:55:11.973069 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:11.973003 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6m2jn" event={"ID":"5d6a5228-b36b-4028-9609-42507be00403","Type":"ContainerStarted","Data":"62ea31dad57219558a9f72b116ca5d978c0aa06789c1d87842485a111ec62d2a"}
Mar 18 16:55:11.973069 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:11.973066 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6m2jn" event={"ID":"5d6a5228-b36b-4028-9609-42507be00403","Type":"ContainerStarted","Data":"7847d5a495b69ae471aad15af1d59e01eca91153f5c0d1d26a3b443bd277d1df"}
Mar 18 16:55:11.973652 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:11.973128 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6m2jn"
Mar 18 16:55:12.004151 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:12.004095 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6m2jn" podStartSLOduration=2.004080772 podStartE2EDuration="2.004080772s" podCreationTimestamp="2026-03-18 16:55:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:55:12.002870721 +0000 UTC m=+639.718293720" watchObservedRunningTime="2026-03-18 16:55:12.004080772 +0000 UTC m=+639.719503770"
Mar 18 16:55:12.978774 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:12.978742 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-6m2jn"
Mar 18 16:55:13.034057 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:13.034001 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-tt2lv"]
Mar 18 16:55:13.034345 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:13.034299 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-tt2lv" podUID="9c291e94-0e6d-40da-ab84-0be9016c5885" containerName="discovery" containerID="cri-o://d86b5b32197310d664280a907614f3c4f25e520877da3175188b49ccabb47a6e" gracePeriod=30
Mar 18 16:55:13.292553 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:13.292530 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-tt2lv"
Mar 18 16:55:13.398215 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:13.398177 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/9c291e94-0e6d-40da-ab84-0be9016c5885-istio-csr-ca-configmap\") pod \"9c291e94-0e6d-40da-ab84-0be9016c5885\" (UID: \"9c291e94-0e6d-40da-ab84-0be9016c5885\") "
Mar 18 16:55:13.398215 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:13.398219 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcwcv\" (UniqueName: \"kubernetes.io/projected/9c291e94-0e6d-40da-ab84-0be9016c5885-kube-api-access-rcwcv\") pod \"9c291e94-0e6d-40da-ab84-0be9016c5885\" (UID: \"9c291e94-0e6d-40da-ab84-0be9016c5885\") "
Mar 18 16:55:13.398474 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:13.398275 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/9c291e94-0e6d-40da-ab84-0be9016c5885-local-certs\") pod \"9c291e94-0e6d-40da-ab84-0be9016c5885\" (UID: \"9c291e94-0e6d-40da-ab84-0be9016c5885\") "
Mar 18 16:55:13.398474 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:13.398308 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9c291e94-0e6d-40da-ab84-0be9016c5885-istio-kubeconfig\") pod \"9c291e94-0e6d-40da-ab84-0be9016c5885\" (UID: \"9c291e94-0e6d-40da-ab84-0be9016c5885\") "
Mar 18 16:55:13.398474 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:13.398340 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/9c291e94-0e6d-40da-ab84-0be9016c5885-cacerts\") pod \"9c291e94-0e6d-40da-ab84-0be9016c5885\" (UID: \"9c291e94-0e6d-40da-ab84-0be9016c5885\") "
Mar 18 16:55:13.398474 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:13.398380 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/9c291e94-0e6d-40da-ab84-0be9016c5885-istio-csr-dns-cert\") pod \"9c291e94-0e6d-40da-ab84-0be9016c5885\" (UID: \"9c291e94-0e6d-40da-ab84-0be9016c5885\") "
Mar 18 16:55:13.398474 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:13.398421 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/9c291e94-0e6d-40da-ab84-0be9016c5885-istio-token\") pod \"9c291e94-0e6d-40da-ab84-0be9016c5885\" (UID: \"9c291e94-0e6d-40da-ab84-0be9016c5885\") "
Mar 18 16:55:13.398730 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:13.398696 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c291e94-0e6d-40da-ab84-0be9016c5885-istio-csr-ca-configmap" (OuterVolumeSpecName: "istio-csr-ca-configmap") pod "9c291e94-0e6d-40da-ab84-0be9016c5885" (UID: "9c291e94-0e6d-40da-ab84-0be9016c5885"). InnerVolumeSpecName "istio-csr-ca-configmap". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:55:13.401128 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:13.401081 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c291e94-0e6d-40da-ab84-0be9016c5885-istio-csr-dns-cert" (OuterVolumeSpecName: "istio-csr-dns-cert") pod "9c291e94-0e6d-40da-ab84-0be9016c5885" (UID: "9c291e94-0e6d-40da-ab84-0be9016c5885"). InnerVolumeSpecName "istio-csr-dns-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:55:13.401387 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:13.401338 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c291e94-0e6d-40da-ab84-0be9016c5885-cacerts" (OuterVolumeSpecName: "cacerts") pod "9c291e94-0e6d-40da-ab84-0be9016c5885" (UID: "9c291e94-0e6d-40da-ab84-0be9016c5885"). InnerVolumeSpecName "cacerts". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:55:13.401387 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:13.401347 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c291e94-0e6d-40da-ab84-0be9016c5885-istio-kubeconfig" (OuterVolumeSpecName: "istio-kubeconfig") pod "9c291e94-0e6d-40da-ab84-0be9016c5885" (UID: "9c291e94-0e6d-40da-ab84-0be9016c5885"). InnerVolumeSpecName "istio-kubeconfig". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:55:13.401387 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:13.401367 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c291e94-0e6d-40da-ab84-0be9016c5885-local-certs" (OuterVolumeSpecName: "local-certs") pod "9c291e94-0e6d-40da-ab84-0be9016c5885" (UID: "9c291e94-0e6d-40da-ab84-0be9016c5885"). InnerVolumeSpecName "local-certs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 18 16:55:13.401572 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:13.401387 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c291e94-0e6d-40da-ab84-0be9016c5885-istio-token" (OuterVolumeSpecName: "istio-token") pod "9c291e94-0e6d-40da-ab84-0be9016c5885" (UID: "9c291e94-0e6d-40da-ab84-0be9016c5885"). InnerVolumeSpecName "istio-token".
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:55:13.401620 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:13.401593 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c291e94-0e6d-40da-ab84-0be9016c5885-kube-api-access-rcwcv" (OuterVolumeSpecName: "kube-api-access-rcwcv") pod "9c291e94-0e6d-40da-ab84-0be9016c5885" (UID: "9c291e94-0e6d-40da-ab84-0be9016c5885"). InnerVolumeSpecName "kube-api-access-rcwcv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:55:13.499340 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:13.499305 2573 reconciler_common.go:299] "Volume detached for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/9c291e94-0e6d-40da-ab84-0be9016c5885-istio-csr-ca-configmap\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\"" Mar 18 16:55:13.499340 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:13.499336 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rcwcv\" (UniqueName: \"kubernetes.io/projected/9c291e94-0e6d-40da-ab84-0be9016c5885-kube-api-access-rcwcv\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\"" Mar 18 16:55:13.499340 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:13.499347 2573 reconciler_common.go:299] "Volume detached for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/9c291e94-0e6d-40da-ab84-0be9016c5885-local-certs\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\"" Mar 18 16:55:13.499575 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:13.499357 2573 reconciler_common.go:299] "Volume detached for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9c291e94-0e6d-40da-ab84-0be9016c5885-istio-kubeconfig\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\"" Mar 18 16:55:13.499575 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:13.499372 2573 reconciler_common.go:299] "Volume detached for volume \"cacerts\" (UniqueName: 
\"kubernetes.io/secret/9c291e94-0e6d-40da-ab84-0be9016c5885-cacerts\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\"" Mar 18 16:55:13.499575 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:13.499384 2573 reconciler_common.go:299] "Volume detached for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/9c291e94-0e6d-40da-ab84-0be9016c5885-istio-csr-dns-cert\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\"" Mar 18 16:55:13.499575 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:13.499395 2573 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/9c291e94-0e6d-40da-ab84-0be9016c5885-istio-token\") on node \"ip-10-0-135-99.ec2.internal\" DevicePath \"\"" Mar 18 16:55:13.984489 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:13.984456 2573 generic.go:358] "Generic (PLEG): container finished" podID="9c291e94-0e6d-40da-ab84-0be9016c5885" containerID="d86b5b32197310d664280a907614f3c4f25e520877da3175188b49ccabb47a6e" exitCode=0 Mar 18 16:55:13.984914 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:13.984518 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-tt2lv" Mar 18 16:55:13.984914 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:13.984540 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-tt2lv" event={"ID":"9c291e94-0e6d-40da-ab84-0be9016c5885","Type":"ContainerDied","Data":"d86b5b32197310d664280a907614f3c4f25e520877da3175188b49ccabb47a6e"} Mar 18 16:55:13.984914 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:13.984579 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-tt2lv" event={"ID":"9c291e94-0e6d-40da-ab84-0be9016c5885","Type":"ContainerDied","Data":"43115f1c85ad96576c500e01640034c4c9841ddf48ebb88e308cbe2a9ab8652d"} Mar 18 16:55:13.984914 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:13.984594 2573 scope.go:117] "RemoveContainer" containerID="d86b5b32197310d664280a907614f3c4f25e520877da3175188b49ccabb47a6e" Mar 18 16:55:13.994827 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:13.994798 2573 scope.go:117] "RemoveContainer" containerID="d86b5b32197310d664280a907614f3c4f25e520877da3175188b49ccabb47a6e" Mar 18 16:55:13.995117 ip-10-0-135-99 kubenswrapper[2573]: E0318 16:55:13.995098 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d86b5b32197310d664280a907614f3c4f25e520877da3175188b49ccabb47a6e\": container with ID starting with d86b5b32197310d664280a907614f3c4f25e520877da3175188b49ccabb47a6e not found: ID does not exist" containerID="d86b5b32197310d664280a907614f3c4f25e520877da3175188b49ccabb47a6e" Mar 18 16:55:13.995173 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:13.995126 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d86b5b32197310d664280a907614f3c4f25e520877da3175188b49ccabb47a6e"} err="failed to get container status 
\"d86b5b32197310d664280a907614f3c4f25e520877da3175188b49ccabb47a6e\": rpc error: code = NotFound desc = could not find container \"d86b5b32197310d664280a907614f3c4f25e520877da3175188b49ccabb47a6e\": container with ID starting with d86b5b32197310d664280a907614f3c4f25e520877da3175188b49ccabb47a6e not found: ID does not exist" Mar 18 16:55:14.009381 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:14.009344 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-tt2lv"] Mar 18 16:55:14.014172 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:14.014139 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-tt2lv"] Mar 18 16:55:14.888645 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:14.888606 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c291e94-0e6d-40da-ab84-0be9016c5885" path="/var/lib/kubelet/pods/9c291e94-0e6d-40da-ab84-0be9016c5885/volumes" Mar 18 16:55:18.779538 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:18.779500 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-sxlqs"] Mar 18 16:55:18.780082 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:18.780062 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c291e94-0e6d-40da-ab84-0be9016c5885" containerName="discovery" Mar 18 16:55:18.780165 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:18.780085 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c291e94-0e6d-40da-ab84-0be9016c5885" containerName="discovery" Mar 18 16:55:18.780223 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:18.780194 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="9c291e94-0e6d-40da-ab84-0be9016c5885" containerName="discovery" Mar 18 16:55:18.783138 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:18.783115 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-sxlqs" Mar 18 16:55:18.785656 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:18.785632 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-zj86q\"" Mar 18 16:55:18.785796 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:18.785632 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Mar 18 16:55:18.786784 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:18.786744 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Mar 18 16:55:18.786892 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:18.786857 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Mar 18 16:55:18.793310 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:18.792294 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-sxlqs"] Mar 18 16:55:18.837711 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:18.837673 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db6dl\" (UniqueName: \"kubernetes.io/projected/4492c257-605d-4430-88d1-60c5bdee03a2-kube-api-access-db6dl\") pod \"llmisvc-controller-manager-68cc5db7c4-sxlqs\" (UID: \"4492c257-605d-4430-88d1-60c5bdee03a2\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-sxlqs" Mar 18 16:55:18.837871 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:18.837724 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4492c257-605d-4430-88d1-60c5bdee03a2-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-sxlqs\" (UID: \"4492c257-605d-4430-88d1-60c5bdee03a2\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-sxlqs" Mar 18 
16:55:18.938684 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:18.938635 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-db6dl\" (UniqueName: \"kubernetes.io/projected/4492c257-605d-4430-88d1-60c5bdee03a2-kube-api-access-db6dl\") pod \"llmisvc-controller-manager-68cc5db7c4-sxlqs\" (UID: \"4492c257-605d-4430-88d1-60c5bdee03a2\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-sxlqs" Mar 18 16:55:18.938890 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:18.938697 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4492c257-605d-4430-88d1-60c5bdee03a2-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-sxlqs\" (UID: \"4492c257-605d-4430-88d1-60c5bdee03a2\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-sxlqs" Mar 18 16:55:18.941466 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:18.941436 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4492c257-605d-4430-88d1-60c5bdee03a2-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-sxlqs\" (UID: \"4492c257-605d-4430-88d1-60c5bdee03a2\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-sxlqs" Mar 18 16:55:18.952577 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:18.952552 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-db6dl\" (UniqueName: \"kubernetes.io/projected/4492c257-605d-4430-88d1-60c5bdee03a2-kube-api-access-db6dl\") pod \"llmisvc-controller-manager-68cc5db7c4-sxlqs\" (UID: \"4492c257-605d-4430-88d1-60c5bdee03a2\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-sxlqs" Mar 18 16:55:19.097206 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:19.097112 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-sxlqs" Mar 18 16:55:19.226334 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:19.226306 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-sxlqs"] Mar 18 16:55:19.227913 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:55:19.227878 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4492c257_605d_4430_88d1_60c5bdee03a2.slice/crio-02e2614869e5b557b566bac0d727841e80a05e9db60fd255cb7d916f163ee5ea WatchSource:0}: Error finding container 02e2614869e5b557b566bac0d727841e80a05e9db60fd255cb7d916f163ee5ea: Status 404 returned error can't find the container with id 02e2614869e5b557b566bac0d727841e80a05e9db60fd255cb7d916f163ee5ea Mar 18 16:55:20.017396 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:20.017356 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-sxlqs" event={"ID":"4492c257-605d-4430-88d1-60c5bdee03a2","Type":"ContainerStarted","Data":"02e2614869e5b557b566bac0d727841e80a05e9db60fd255cb7d916f163ee5ea"} Mar 18 16:55:22.027750 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:22.027706 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-sxlqs" event={"ID":"4492c257-605d-4430-88d1-60c5bdee03a2","Type":"ContainerStarted","Data":"76b1f3bdaa604292e8c48c16bdfdf727f53731625e137311530a7ae3d43b6902"} Mar 18 16:55:22.028231 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:22.027759 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-sxlqs" Mar 18 16:55:22.043715 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:22.043661 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-sxlqs" podStartSLOduration=1.9999383499999999 podStartE2EDuration="4.043645564s" 
podCreationTimestamp="2026-03-18 16:55:18 +0000 UTC" firstStartedPulling="2026-03-18 16:55:19.22930606 +0000 UTC m=+646.944729037" lastFinishedPulling="2026-03-18 16:55:21.273013276 +0000 UTC m=+648.988436251" observedRunningTime="2026-03-18 16:55:22.042091377 +0000 UTC m=+649.757514375" watchObservedRunningTime="2026-03-18 16:55:22.043645564 +0000 UTC m=+649.759068561" Mar 18 16:55:53.034701 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:55:53.034671 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-sxlqs" Mar 18 16:57:23.511773 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:57:23.511736 2573 generic.go:358] "Generic (PLEG): container finished" podID="4492c257-605d-4430-88d1-60c5bdee03a2" containerID="76b1f3bdaa604292e8c48c16bdfdf727f53731625e137311530a7ae3d43b6902" exitCode=1 Mar 18 16:57:23.512243 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:57:23.511808 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-sxlqs" event={"ID":"4492c257-605d-4430-88d1-60c5bdee03a2","Type":"ContainerDied","Data":"76b1f3bdaa604292e8c48c16bdfdf727f53731625e137311530a7ae3d43b6902"} Mar 18 16:57:23.512243 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:57:23.512140 2573 scope.go:117] "RemoveContainer" containerID="76b1f3bdaa604292e8c48c16bdfdf727f53731625e137311530a7ae3d43b6902" Mar 18 16:57:23.512446 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:57:23.512432 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:57:24.517134 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:57:24.517098 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-sxlqs" event={"ID":"4492c257-605d-4430-88d1-60c5bdee03a2","Type":"ContainerStarted","Data":"7daadae7422eed21faddc16575e5181362aa4fb7ca1a7f2a487910a1d26a419a"} Mar 18 16:57:24.517558 ip-10-0-135-99 kubenswrapper[2573]: I0318 
16:57:24.517315 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-sxlqs" Mar 18 16:57:48.039668 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:57:48.039634 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-6m2jn_5d6a5228-b36b-4028-9609-42507be00403/discovery/0.log" Mar 18 16:57:49.911520 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:57:49.911429 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-6m2jn_5d6a5228-b36b-4028-9609-42507be00403/discovery/0.log" Mar 18 16:57:50.609596 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:57:50.609567 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-xfdrq_25c99cac-7f4e-4de0-bbc0-ecc93917f93f/manager/0.log" Mar 18 16:57:50.656292 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:57:50.656260 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6c4df77685-bmsdt_3e363269-fc50-4b20-a57f-e5b79da02a1d/manager/0.log" Mar 18 16:57:50.697270 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:57:50.697240 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-97lx5_50bf4aec-07e0-49a0-8b8b-c8a6fc1aadc9/manager/0.log" Mar 18 16:57:51.382186 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:57:51.382154 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-xfdrq_25c99cac-7f4e-4de0-bbc0-ecc93917f93f/manager/0.log" Mar 18 16:57:51.430824 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:57:51.430786 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6c4df77685-bmsdt_3e363269-fc50-4b20-a57f-e5b79da02a1d/manager/0.log" Mar 18 16:57:51.460527 
ip-10-0-135-99 kubenswrapper[2573]: I0318 16:57:51.460493 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-97lx5_50bf4aec-07e0-49a0-8b8b-c8a6fc1aadc9/manager/0.log" Mar 18 16:57:52.143111 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:57:52.143081 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-xfdrq_25c99cac-7f4e-4de0-bbc0-ecc93917f93f/manager/0.log" Mar 18 16:57:52.185909 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:57:52.185871 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6c4df77685-bmsdt_3e363269-fc50-4b20-a57f-e5b79da02a1d/manager/0.log" Mar 18 16:57:52.210688 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:57:52.210660 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-97lx5_50bf4aec-07e0-49a0-8b8b-c8a6fc1aadc9/manager/0.log" Mar 18 16:57:52.862494 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:57:52.862456 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-xfdrq_25c99cac-7f4e-4de0-bbc0-ecc93917f93f/manager/0.log" Mar 18 16:57:52.907036 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:57:52.906997 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6c4df77685-bmsdt_3e363269-fc50-4b20-a57f-e5b79da02a1d/manager/0.log" Mar 18 16:57:52.930812 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:57:52.930778 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-97lx5_50bf4aec-07e0-49a0-8b8b-c8a6fc1aadc9/manager/0.log" Mar 18 16:57:53.592364 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:57:53.592323 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-xfdrq_25c99cac-7f4e-4de0-bbc0-ecc93917f93f/manager/0.log" Mar 18 16:57:53.642634 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:57:53.642604 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6c4df77685-bmsdt_3e363269-fc50-4b20-a57f-e5b79da02a1d/manager/0.log" Mar 18 16:57:53.675174 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:57:53.675120 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-97lx5_50bf4aec-07e0-49a0-8b8b-c8a6fc1aadc9/manager/0.log" Mar 18 16:57:55.523551 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:57:55.523519 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-sxlqs" Mar 18 16:57:58.521623 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:57:58.521585 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-xkzjh_ac3a60cb-dba0-4585-a37d-0402db777ed0/global-pull-secret-syncer/0.log" Mar 18 16:57:58.557431 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:57:58.557404 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-jndjx_2f0c6c9c-5cbc-4d92-9fee-ad30b7cb88a6/konnectivity-agent/0.log" Mar 18 16:57:58.677052 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:57:58.677023 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-99.ec2.internal_ccad537f0fa817f43bb3d0d1231fcf27/haproxy/0.log" Mar 18 16:58:02.793244 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:02.793207 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-xfdrq_25c99cac-7f4e-4de0-bbc0-ecc93917f93f/manager/0.log" Mar 18 16:58:02.881548 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:02.881515 2573 log.go:25] "Finished parsing log 
file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6c4df77685-bmsdt_3e363269-fc50-4b20-a57f-e5b79da02a1d/manager/0.log" Mar 18 16:58:02.936956 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:02.936909 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-97lx5_50bf4aec-07e0-49a0-8b8b-c8a6fc1aadc9/manager/0.log" Mar 18 16:58:04.501771 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:04.501743 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zkc96_6020e2e0-83e5-49b8-a158-e98d1e6697a8/node-exporter/0.log" Mar 18 16:58:04.524954 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:04.524916 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zkc96_6020e2e0-83e5-49b8-a158-e98d1e6697a8/kube-rbac-proxy/0.log" Mar 18 16:58:04.548222 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:04.548201 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zkc96_6020e2e0-83e5-49b8-a158-e98d1e6697a8/init-textfile/0.log" Mar 18 16:58:04.870576 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:04.870488 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-6b948c769-n79l9_d3adf32b-df34-4e20-8641-d4af2dea277a/prometheus-operator/0.log" Mar 18 16:58:04.888553 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:04.888509 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-6b948c769-n79l9_d3adf32b-df34-4e20-8641-d4af2dea277a/kube-rbac-proxy/0.log" Mar 18 16:58:04.917447 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:04.917412 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-8444df798b-gnhf9_ba0971bb-c918-470a-aa1b-4c2fbcc0115a/prometheus-operator-admission-webhook/0.log" Mar 18 
16:58:05.044788 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:05.044740 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-67fdbb775-kzlfs_2386015b-9e91-4f64-b14e-90352af8aad6/thanos-query/0.log" Mar 18 16:58:05.073738 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:05.073711 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-67fdbb775-kzlfs_2386015b-9e91-4f64-b14e-90352af8aad6/kube-rbac-proxy-web/0.log" Mar 18 16:58:05.106773 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:05.106737 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-67fdbb775-kzlfs_2386015b-9e91-4f64-b14e-90352af8aad6/kube-rbac-proxy/0.log" Mar 18 16:58:05.136442 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:05.136371 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-67fdbb775-kzlfs_2386015b-9e91-4f64-b14e-90352af8aad6/prom-label-proxy/0.log" Mar 18 16:58:05.157596 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:05.157563 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-67fdbb775-kzlfs_2386015b-9e91-4f64-b14e-90352af8aad6/kube-rbac-proxy-rules/0.log" Mar 18 16:58:05.180654 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:05.180628 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-67fdbb775-kzlfs_2386015b-9e91-4f64-b14e-90352af8aad6/kube-rbac-proxy-metrics/0.log" Mar 18 16:58:07.296500 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:07.296468 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-5b85974fd6-95vqb_0d492655-5985-4c64-b74b-d3d031ea8e6c/download-server/0.log" Mar 18 16:58:07.335036 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:07.334999 2573 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-zsmk9/perf-node-gather-daemonset-vnc7k"]
Mar 18 16:58:07.338740 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:07.338714 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-vnc7k"
Mar 18 16:58:07.341487 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:07.341453 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zsmk9\"/\"kube-root-ca.crt\""
Mar 18 16:58:07.342462 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:07.342442 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-zsmk9\"/\"default-dockercfg-vmgwf\""
Mar 18 16:58:07.342551 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:07.342471 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zsmk9\"/\"openshift-service-ca.crt\""
Mar 18 16:58:07.347528 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:07.347498 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zsmk9/perf-node-gather-daemonset-vnc7k"]
Mar 18 16:58:07.446331 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:07.446286 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p4v5\" (UniqueName: \"kubernetes.io/projected/5bc3f76e-78b6-4ab2-ab03-23666fed8498-kube-api-access-9p4v5\") pod \"perf-node-gather-daemonset-vnc7k\" (UID: \"5bc3f76e-78b6-4ab2-ab03-23666fed8498\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-vnc7k"
Mar 18 16:58:07.446505 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:07.446345 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5bc3f76e-78b6-4ab2-ab03-23666fed8498-proc\") pod \"perf-node-gather-daemonset-vnc7k\" (UID: \"5bc3f76e-78b6-4ab2-ab03-23666fed8498\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-vnc7k"
Mar 18 16:58:07.446505 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:07.446394 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5bc3f76e-78b6-4ab2-ab03-23666fed8498-sys\") pod \"perf-node-gather-daemonset-vnc7k\" (UID: \"5bc3f76e-78b6-4ab2-ab03-23666fed8498\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-vnc7k"
Mar 18 16:58:07.446505 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:07.446411 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5bc3f76e-78b6-4ab2-ab03-23666fed8498-lib-modules\") pod \"perf-node-gather-daemonset-vnc7k\" (UID: \"5bc3f76e-78b6-4ab2-ab03-23666fed8498\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-vnc7k"
Mar 18 16:58:07.446505 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:07.446431 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5bc3f76e-78b6-4ab2-ab03-23666fed8498-podres\") pod \"perf-node-gather-daemonset-vnc7k\" (UID: \"5bc3f76e-78b6-4ab2-ab03-23666fed8498\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-vnc7k"
Mar 18 16:58:07.547814 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:07.547729 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9p4v5\" (UniqueName: \"kubernetes.io/projected/5bc3f76e-78b6-4ab2-ab03-23666fed8498-kube-api-access-9p4v5\") pod \"perf-node-gather-daemonset-vnc7k\" (UID: \"5bc3f76e-78b6-4ab2-ab03-23666fed8498\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-vnc7k"
Mar 18 16:58:07.547814 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:07.547792 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5bc3f76e-78b6-4ab2-ab03-23666fed8498-proc\") pod \"perf-node-gather-daemonset-vnc7k\" (UID: \"5bc3f76e-78b6-4ab2-ab03-23666fed8498\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-vnc7k"
Mar 18 16:58:07.548035 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:07.547819 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5bc3f76e-78b6-4ab2-ab03-23666fed8498-sys\") pod \"perf-node-gather-daemonset-vnc7k\" (UID: \"5bc3f76e-78b6-4ab2-ab03-23666fed8498\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-vnc7k"
Mar 18 16:58:07.548035 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:07.547840 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5bc3f76e-78b6-4ab2-ab03-23666fed8498-lib-modules\") pod \"perf-node-gather-daemonset-vnc7k\" (UID: \"5bc3f76e-78b6-4ab2-ab03-23666fed8498\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-vnc7k"
Mar 18 16:58:07.548035 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:07.547862 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5bc3f76e-78b6-4ab2-ab03-23666fed8498-podres\") pod \"perf-node-gather-daemonset-vnc7k\" (UID: \"5bc3f76e-78b6-4ab2-ab03-23666fed8498\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-vnc7k"
Mar 18 16:58:07.548035 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:07.547969 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5bc3f76e-78b6-4ab2-ab03-23666fed8498-proc\") pod \"perf-node-gather-daemonset-vnc7k\" (UID: \"5bc3f76e-78b6-4ab2-ab03-23666fed8498\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-vnc7k"
Mar 18 16:58:07.548035 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:07.547967 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5bc3f76e-78b6-4ab2-ab03-23666fed8498-sys\") pod \"perf-node-gather-daemonset-vnc7k\" (UID: \"5bc3f76e-78b6-4ab2-ab03-23666fed8498\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-vnc7k"
Mar 18 16:58:07.548035 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:07.548033 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5bc3f76e-78b6-4ab2-ab03-23666fed8498-podres\") pod \"perf-node-gather-daemonset-vnc7k\" (UID: \"5bc3f76e-78b6-4ab2-ab03-23666fed8498\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-vnc7k"
Mar 18 16:58:07.548255 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:07.548033 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5bc3f76e-78b6-4ab2-ab03-23666fed8498-lib-modules\") pod \"perf-node-gather-daemonset-vnc7k\" (UID: \"5bc3f76e-78b6-4ab2-ab03-23666fed8498\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-vnc7k"
Mar 18 16:58:07.555882 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:07.555835 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p4v5\" (UniqueName: \"kubernetes.io/projected/5bc3f76e-78b6-4ab2-ab03-23666fed8498-kube-api-access-9p4v5\") pod \"perf-node-gather-daemonset-vnc7k\" (UID: \"5bc3f76e-78b6-4ab2-ab03-23666fed8498\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-vnc7k"
Mar 18 16:58:07.649868 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:07.649821 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-vnc7k"
Mar 18 16:58:07.777982 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:07.777958 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zsmk9/perf-node-gather-daemonset-vnc7k"]
Mar 18 16:58:07.779825 ip-10-0-135-99 kubenswrapper[2573]: W0318 16:58:07.779789 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5bc3f76e_78b6_4ab2_ab03_23666fed8498.slice/crio-aeea2d2253328248687a5f1d572f57be233117f61d9eae698190bd5560ce96ab WatchSource:0}: Error finding container aeea2d2253328248687a5f1d572f57be233117f61d9eae698190bd5560ce96ab: Status 404 returned error can't find the container with id aeea2d2253328248687a5f1d572f57be233117f61d9eae698190bd5560ce96ab
Mar 18 16:58:08.488224 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:08.488182 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-bxl54_5f10649a-4fba-40f6-9e10-02d8301f5e9e/dns/0.log"
Mar 18 16:58:08.509623 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:08.509587 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-bxl54_5f10649a-4fba-40f6-9e10-02d8301f5e9e/kube-rbac-proxy/0.log"
Mar 18 16:58:08.632817 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:08.632772 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-8dd92_ef1bc9f1-f14a-4cb6-8631-d7ab65c971c7/dns-node-resolver/0.log"
Mar 18 16:58:08.691578 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:08.691542 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-vnc7k" event={"ID":"5bc3f76e-78b6-4ab2-ab03-23666fed8498","Type":"ContainerStarted","Data":"8b195eb0544f533ed3e182b9faa8d81fe80cd27f36c7da0b09a953d3c31a9ae7"}
Mar 18 16:58:08.691762 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:08.691586 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-vnc7k" event={"ID":"5bc3f76e-78b6-4ab2-ab03-23666fed8498","Type":"ContainerStarted","Data":"aeea2d2253328248687a5f1d572f57be233117f61d9eae698190bd5560ce96ab"}
Mar 18 16:58:08.691762 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:08.691618 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-vnc7k"
Mar 18 16:58:08.708530 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:08.708469 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-vnc7k" podStartSLOduration=1.708453493 podStartE2EDuration="1.708453493s" podCreationTimestamp="2026-03-18 16:58:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:58:08.706390797 +0000 UTC m=+816.421813793" watchObservedRunningTime="2026-03-18 16:58:08.708453493 +0000 UTC m=+816.423876533"
Mar 18 16:58:09.137170 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:09.137075 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fxsfb_eb98c083-6ae5-4745-a6be-ff841741f1f6/node-ca/0.log"
Mar 18 16:58:10.002745 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:10.002711 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-6m2jn_5d6a5228-b36b-4028-9609-42507be00403/discovery/0.log"
Mar 18 16:58:10.516390 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:10.516356 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-7zqf5_1a4fd5d2-e5ee-4db9-9545-6baa4c7e38b3/serve-healthcheck-canary/0.log"
Mar 18 16:58:11.066775 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:11.066748 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fq5mw_278983c0-c182-4da5-a2f5-9f699fb47ede/kube-rbac-proxy/0.log"
Mar 18 16:58:11.089375 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:11.089337 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fq5mw_278983c0-c182-4da5-a2f5-9f699fb47ede/exporter/0.log"
Mar 18 16:58:11.111355 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:11.111327 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fq5mw_278983c0-c182-4da5-a2f5-9f699fb47ede/extractor/0.log"
Mar 18 16:58:13.676832 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:13.676799 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-fv5dz_f8917ccf-5984-4b2b-880b-c489ba5944a6/openshift-lws-operator/0.log"
Mar 18 16:58:14.241044 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:14.241006 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-sxlqs_4492c257-605d-4430-88d1-60c5bdee03a2/manager/1.log"
Mar 18 16:58:14.242183 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:14.242159 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-sxlqs_4492c257-605d-4430-88d1-60c5bdee03a2/manager/0.log"
Mar 18 16:58:14.706770 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:14.706737 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-vnc7k"
Mar 18 16:58:19.325535 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:19.325496 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-866f46547-l4zpq_b5757757-50ec-49cc-93fb-20785bb506cb/kube-storage-version-migrator-operator/1.log"
Mar 18 16:58:19.327369 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:19.327343 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-866f46547-l4zpq_b5757757-50ec-49cc-93fb-20785bb506cb/kube-storage-version-migrator-operator/0.log"
Mar 18 16:58:20.388607 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:20.388576 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4npwk_7b009b7f-519d-4077-b100-93c7b9934af9/kube-multus-additional-cni-plugins/0.log"
Mar 18 16:58:20.410631 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:20.410603 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4npwk_7b009b7f-519d-4077-b100-93c7b9934af9/egress-router-binary-copy/0.log"
Mar 18 16:58:20.431644 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:20.431618 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4npwk_7b009b7f-519d-4077-b100-93c7b9934af9/cni-plugins/0.log"
Mar 18 16:58:20.453104 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:20.453069 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4npwk_7b009b7f-519d-4077-b100-93c7b9934af9/bond-cni-plugin/0.log"
Mar 18 16:58:20.475974 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:20.475927 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4npwk_7b009b7f-519d-4077-b100-93c7b9934af9/routeoverride-cni/0.log"
Mar 18 16:58:20.496415 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:20.496384 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4npwk_7b009b7f-519d-4077-b100-93c7b9934af9/whereabouts-cni-bincopy/0.log"
Mar 18 16:58:20.517435 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:20.517407 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4npwk_7b009b7f-519d-4077-b100-93c7b9934af9/whereabouts-cni/0.log"
Mar 18 16:58:20.912019 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:20.911980 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zdh9l_6bd0541f-f19e-4cdf-b03e-55eabcf75d7e/kube-multus/0.log"
Mar 18 16:58:21.035141 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:21.035104 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-wtbdl_52f2a3f3-56d7-41f7-8bed-9e7229d96408/network-metrics-daemon/0.log"
Mar 18 16:58:21.053571 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:21.053536 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-wtbdl_52f2a3f3-56d7-41f7-8bed-9e7229d96408/kube-rbac-proxy/0.log"
Mar 18 16:58:22.167854 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:22.167817 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qzdnc_c6018dac-2f72-43d9-b554-dffe8cf976c4/ovn-controller/0.log"
Mar 18 16:58:22.184331 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:22.184299 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qzdnc_c6018dac-2f72-43d9-b554-dffe8cf976c4/ovn-acl-logging/0.log"
Mar 18 16:58:22.191564 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:22.191510 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qzdnc_c6018dac-2f72-43d9-b554-dffe8cf976c4/ovn-acl-logging/1.log"
Mar 18 16:58:22.217149 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:22.217112 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qzdnc_c6018dac-2f72-43d9-b554-dffe8cf976c4/kube-rbac-proxy-node/0.log"
Mar 18 16:58:22.241392 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:22.241352 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qzdnc_c6018dac-2f72-43d9-b554-dffe8cf976c4/kube-rbac-proxy-ovn-metrics/0.log"
Mar 18 16:58:22.258155 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:22.258127 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qzdnc_c6018dac-2f72-43d9-b554-dffe8cf976c4/northd/0.log"
Mar 18 16:58:22.279357 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:22.279331 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qzdnc_c6018dac-2f72-43d9-b554-dffe8cf976c4/nbdb/0.log"
Mar 18 16:58:22.300765 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:22.300738 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qzdnc_c6018dac-2f72-43d9-b554-dffe8cf976c4/sbdb/0.log"
Mar 18 16:58:22.467549 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:22.467516 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qzdnc_c6018dac-2f72-43d9-b554-dffe8cf976c4/ovnkube-controller/0.log"
Mar 18 16:58:23.844737 ip-10-0-135-99 kubenswrapper[2573]: I0318 16:58:23.844710 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-9smhv_8b451861-a208-430f-840a-bce654bef71f/network-check-target-container/0.log"